[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 8238 1726882368.95882: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-AQL executable location = /usr/local/bin/ansible-playbook python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 8238 1726882368.96404: Added group all to inventory 8238 1726882368.96407: Added group ungrouped to inventory 8238 1726882368.96411: Group all now contains ungrouped 8238 1726882368.96414: Examining possible inventory source: /tmp/network-mVt/inventory.yml 8238 1726882369.27354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 8238 1726882369.27418: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 8238 1726882369.27445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 8238 1726882369.27511: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 8238 1726882369.27674: Loaded config def from plugin (inventory/script) 8238 1726882369.27677: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 8238 1726882369.27717: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 8238 1726882369.27811: Loaded config def from plugin (inventory/yaml) 8238 1726882369.27813: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 8238 1726882369.28211: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 8238 1726882369.29079: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 8238 1726882369.29083: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 8238 1726882369.29086: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 8238 1726882369.29092: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 8238 1726882369.29096: Loading data from /tmp/network-mVt/inventory.yml 8238 1726882369.29373: /tmp/network-mVt/inventory.yml was not parsable by auto 8238 1726882369.29444: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 8238 1726882369.29486: Loading data from /tmp/network-mVt/inventory.yml 8238 1726882369.29778: group all already in inventory 8238 1726882369.29789: set inventory_file for managed_node1 8238 1726882369.30027: set inventory_dir for managed_node1 8238 1726882369.30029: Added host managed_node1 to inventory 8238 1726882369.30032: Added host managed_node1 to group all 8238 1726882369.30033: set ansible_host for managed_node1 8238 1726882369.30034: set ansible_ssh_extra_args for 
managed_node1 8238 1726882369.30038: set inventory_file for managed_node2 8238 1726882369.30041: set inventory_dir for managed_node2 8238 1726882369.30042: Added host managed_node2 to inventory 8238 1726882369.30044: Added host managed_node2 to group all 8238 1726882369.30045: set ansible_host for managed_node2 8238 1726882369.30046: set ansible_ssh_extra_args for managed_node2 8238 1726882369.30048: set inventory_file for managed_node3 8238 1726882369.30051: set inventory_dir for managed_node3 8238 1726882369.30052: Added host managed_node3 to inventory 8238 1726882369.30053: Added host managed_node3 to group all 8238 1726882369.30054: set ansible_host for managed_node3 8238 1726882369.30055: set ansible_ssh_extra_args for managed_node3 8238 1726882369.30058: Reconcile groups and hosts in inventory. 8238 1726882369.30062: Group ungrouped now contains managed_node1 8238 1726882369.30064: Group ungrouped now contains managed_node2 8238 1726882369.30066: Group ungrouped now contains managed_node3 8238 1726882369.30149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 8238 1726882369.30279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 8238 1726882369.30638: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 8238 1726882369.30669: Loaded config def from plugin (vars/host_group_vars) 8238 1726882369.30671: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 8238 1726882369.30678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 8238 1726882369.30687: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8238 1726882369.30732: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 8238 1726882369.31468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882369.31566: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 8238 1726882369.31612: Loaded config def from plugin (connection/local) 8238 1726882369.31615: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 8238 1726882369.33389: Loaded config def from plugin (connection/paramiko_ssh) 8238 1726882369.33393: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 8238 1726882369.35441: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8238 1726882369.35487: Loaded config def from plugin (connection/psrp) 8238 1726882369.35490: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 8238 1726882369.37118: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 
(found_in_cache=True, class_only=False) 8238 1726882369.37164: Loaded config def from plugin (connection/ssh) 8238 1726882369.37168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 8238 1726882369.41774: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8238 1726882369.41820: Loaded config def from plugin (connection/winrm) 8238 1726882369.42012: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 8238 1726882369.42055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 8238 1726882369.42127: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 8238 1726882369.42205: Loaded config def from plugin (shell/cmd) 8238 1726882369.42208: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 8238 1726882369.42441: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 8238 1726882369.42515: Loaded config def from plugin (shell/powershell) 8238 1726882369.42517: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 8238 1726882369.42578: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 8238 1726882369.43185: Loaded config def from plugin (shell/sh) 8238 1726882369.43187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 8238 1726882369.43227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 8238 1726882369.43361: Loaded config def from plugin (become/runas) 8238 1726882369.43364: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 8238 1726882369.43775: Loaded config def from plugin (become/su) 8238 1726882369.43777: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 8238 1726882369.44359: Loaded config def from plugin (become/sudo) 8238 1726882369.44361: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 8238 1726882369.44399: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml 8238 1726882369.45064: in VariableManager get_vars() 8238 1726882369.45085: done with get_vars() 8238 1726882369.45323: trying /usr/local/lib/python3.12/site-packages/ansible/modules 8238 1726882369.51879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 8238 1726882369.52233: in VariableManager get_vars() 8238 1726882369.52239: done with get_vars() 8238 1726882369.52242: variable 'playbook_dir' from source: magic vars 8238 
1726882369.52243: variable 'ansible_playbook_python' from source: magic vars 8238 1726882369.52244: variable 'ansible_config_file' from source: magic vars 8238 1726882369.52244: variable 'groups' from source: magic vars 8238 1726882369.52245: variable 'omit' from source: magic vars 8238 1726882369.52246: variable 'ansible_version' from source: magic vars 8238 1726882369.52247: variable 'ansible_check_mode' from source: magic vars 8238 1726882369.52247: variable 'ansible_diff_mode' from source: magic vars 8238 1726882369.52248: variable 'ansible_forks' from source: magic vars 8238 1726882369.52249: variable 'ansible_inventory_sources' from source: magic vars 8238 1726882369.52249: variable 'ansible_skip_tags' from source: magic vars 8238 1726882369.52250: variable 'ansible_limit' from source: magic vars 8238 1726882369.52251: variable 'ansible_run_tags' from source: magic vars 8238 1726882369.52252: variable 'ansible_verbosity' from source: magic vars 8238 1726882369.52292: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml 8238 1726882369.53932: in VariableManager get_vars() 8238 1726882369.53949: done with get_vars() 8238 1726882369.53959: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 8238 1726882369.56255: in VariableManager get_vars() 8238 1726882369.56271: done with get_vars() 8238 1726882369.56280: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 8238 1726882369.56511: in VariableManager get_vars() 8238 1726882369.56644: done with get_vars() 8238 1726882369.56931: in VariableManager get_vars() 8238 1726882369.56946: done with get_vars() 8238 1726882369.57030: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 8238 1726882369.57273: in VariableManager get_vars() 8238 1726882369.57288: done with get_vars() 8238 1726882369.57811: in VariableManager get_vars() 8238 1726882369.57826: done with get_vars() 8238 1726882369.57831: variable 'omit' from source: magic vars 8238 1726882369.57848: variable 'omit' from source: magic vars 8238 1726882369.57998: in VariableManager get_vars() 8238 1726882369.58011: done with get_vars() 8238 1726882369.58132: in VariableManager get_vars() 8238 1726882369.58145: done with get_vars() 8238 1726882369.58184: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 8238 1726882369.58775: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 8238 1726882369.59028: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 8238 1726882369.60552: in VariableManager get_vars() 8238 1726882369.60578: done with get_vars() 8238 1726882369.61695: trying 
/usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 8238 1726882369.61968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8238 1726882369.65575: in VariableManager get_vars() 8238 1726882369.65596: done with get_vars() 8238 1726882369.65606: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 8238 1726882369.66005: in VariableManager get_vars() 8238 1726882369.66029: done with get_vars() 8238 1726882369.66349: in VariableManager get_vars() 8238 1726882369.66425: done with get_vars() 8238 1726882369.67073: in VariableManager get_vars() 8238 1726882369.67091: done with get_vars() 8238 1726882369.67095: variable 'omit' from source: magic vars 8238 1726882369.67126: variable 'omit' from source: magic vars 8238 1726882369.67268: in VariableManager get_vars() 8238 1726882369.67283: done with get_vars() 8238 1726882369.67305: in VariableManager get_vars() 8238 1726882369.67321: done with get_vars() 8238 1726882369.67475: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 8238 1726882369.67652: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 8238 1726882369.71737: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 8238 1726882369.72566: in VariableManager get_vars() 8238 1726882369.72597: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8238 1726882369.77274: in VariableManager get_vars() 8238 1726882369.77298: done with get_vars() 8238 1726882369.77307: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 8238 1726882369.78439: in VariableManager get_vars() 8238 1726882369.78461: done with get_vars() 8238 1726882369.78517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 8238 1726882369.78533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 8238 1726882369.79081: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 8238 1726882369.79368: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 8238 1726882369.79371: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 8238 1726882369.79406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 8238 1726882369.79838: Loading ModuleDocFragment 'default_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8238 1726882369.80410: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 8238 1726882369.80480: Loaded config def from plugin (callback/default) 8238 1726882369.80483: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8238 1726882369.83999: Loaded config def from plugin (callback/junit) 8238 1726882369.84002: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8238 1726882369.84054: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 8238 1726882369.84331: Loaded config def from plugin (callback/minimal) 8238 1726882369.84334: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8238 1726882369.84376: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8238 1726882369.84445: Loaded config def from plugin (callback/tree) 8238 1726882369.84447: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 8238 1726882369.84790: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 8238 1726882369.84792: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
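Two practical notes before the play banner below. The deprecation warning at the very top of this log is resolved by switching to the singular environment variable it names, and output in this timestamped per-PID form is what Ansible's debug mode produces. The exact invocation used for this run is not recorded, so the flags in this sketch are an assumption; the inventory path, collection path, and playbook path are copied from the log:

    # ANSIBLE_COLLECTIONS_PATHS is deprecated (removal planned in ansible-core 2.19);
    # the singular form is the supported spelling, per the warning above.
    export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-AQL

    # "PID <epoch>: ..." lines come from Ansible debug mode; the ssh debug1/debug2
    # chunks further down imply a high verbosity level (assumed -vvvv here).
    ANSIBLE_DEBUG=1 ansible-playbook -vvvv \
        -i /tmp/network-mVt/inventory.yml \
        /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml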
PLAYBOOK: tests_bond_nm.yml **************************************************** 2 plays in /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml 8238 1726882369.84821: in VariableManager get_vars() 8238 1726882369.84838: done with get_vars() 8238 1726882369.84843: in VariableManager get_vars() 8238 1726882369.84851: done with get_vars() 8238 1726882369.84854: variable 'omit' from source: magic vars 8238 1726882369.84889: in VariableManager get_vars() 8238 1726882369.84902: done with get_vars() 8238 1726882369.84921: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] ************* 8238 1726882369.86096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 8238 1726882369.86375: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 8238 1726882369.86543: getting the remaining hosts for this loop 8238 1726882369.86545: done getting the remaining hosts for this loop 8238 1726882369.86549: getting the next task for host managed_node3 8238 1726882369.86552: done getting next task for host managed_node3 8238 1726882369.86554: ^ task is: TASK: Gathering Facts 8238 1726882369.86556: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882369.86558: getting variables 8238 1726882369.86559: in VariableManager get_vars() 8238 1726882369.86569: Calling all_inventory to load vars for managed_node3 8238 1726882369.86571: Calling groups_inventory to load vars for managed_node3 8238 1726882369.86573: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882369.86585: Calling all_plugins_play to load vars for managed_node3 8238 1726882369.86595: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882369.86598: Calling groups_plugins_play to load vars for managed_node3 8238 1726882369.86634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882369.86692: done with get_vars() 8238 1726882369.86699: done getting variables 8238 1726882369.86874: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6 Friday 20 September 2024 21:32:49 -0400 (0:00:00.024) 0:00:00.024 ****** 8238 1726882369.86899: entering _queue_task() for managed_node3/gather_facts 8238 1726882369.86901: Creating lock for gather_facts 8238 1726882369.87678: worker is 1 (out of 1 available) 8238 1726882369.87688: exiting _queue_task() for managed_node3/gather_facts 8238 1726882369.87701: done queuing things up, now waiting for results queue to drain 8238 1726882369.87703: waiting for pending results... 
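The Gathering Facts task that follows packages the setup module into an AnsiballZ payload and runs it on managed_node3 over the ssh connection plugin, as the _low_level_execute_command() calls below show. Roughly the same fact data can be collected ad hoc; a hedged one-liner against the same inventory (not part of this recorded run):

    # Ad-hoc fact gathering against the same host; comparable in effect to the
    # Gathering Facts task, assuming the same inventory and ssh settings apply.
    ansible managed_node3 -i /tmp/network-mVt/inventory.yml -m ansible.builtin.setup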
8238 1726882369.88436: running TaskExecutor() for managed_node3/TASK: Gathering Facts 8238 1726882369.88888: in run() - task 0affc7ec-ae25-54bc-d334-0000000000cc 8238 1726882369.88903: variable 'ansible_search_path' from source: unknown 8238 1726882369.89057: calling self._execute() 8238 1726882369.89126: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882369.89132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882369.89142: variable 'omit' from source: magic vars 8238 1726882369.89616: variable 'omit' from source: magic vars 8238 1726882369.89690: variable 'omit' from source: magic vars 8238 1726882369.89730: variable 'omit' from source: magic vars 8238 1726882369.90000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882369.90051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882369.90072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882369.90090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882369.90537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882369.90569: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882369.90572: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882369.90575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882369.90692: Set connection var ansible_connection to ssh 8238 1726882369.90695: Set connection var ansible_shell_type to sh 8238 1726882369.90700: Set connection var ansible_pipelining to False 8238 1726882369.90706: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882369.90713: Set connection var ansible_timeout to 10 8238 1726882369.90723: Set connection var ansible_shell_executable to /bin/sh 8238 1726882369.91062: variable 'ansible_shell_executable' from source: unknown 8238 1726882369.91065: variable 'ansible_connection' from source: unknown 8238 1726882369.91068: variable 'ansible_module_compression' from source: unknown 8238 1726882369.91070: variable 'ansible_shell_type' from source: unknown 8238 1726882369.91073: variable 'ansible_shell_executable' from source: unknown 8238 1726882369.91076: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882369.91081: variable 'ansible_pipelining' from source: unknown 8238 1726882369.91083: variable 'ansible_timeout' from source: unknown 8238 1726882369.91088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882369.91548: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882369.91555: variable 'omit' from source: magic vars 8238 1726882369.91560: starting attempt loop 8238 1726882369.91563: running the handler 8238 1726882369.91799: variable 'ansible_facts' from source: unknown 8238 1726882369.91820: _low_level_execute_command(): starting 8238 1726882369.91831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && 
sleep 0' 8238 1726882369.93662: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882369.93865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882369.93981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882369.95738: stdout chunk (state=3): >>>/root <<< 8238 1726882369.96140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882369.96143: stdout chunk (state=3): >>><<< 8238 1726882369.96149: stderr chunk (state=3): >>><<< 8238 1726882369.96152: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882369.96158: _low_level_execute_command(): starting 8238 1726882369.96161: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848 `" && echo ansible-tmp-1726882369.9606643-8284-278810064524848="` echo /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848 `" ) && sleep 0' 8238 1726882369.97439: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882369.97458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882369.97472: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 8238 1726882369.97490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882369.97511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882369.97629: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882369.97721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882369.97876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882369.97966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882370.01667: stdout chunk (state=3): >>>ansible-tmp-1726882369.9606643-8284-278810064524848=/root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848 <<< 8238 1726882370.02229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882370.02233: stdout chunk (state=3): >>><<< 8238 1726882370.02236: stderr chunk (state=3): >>><<< 8238 1726882370.02239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882369.9606643-8284-278810064524848=/root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882370.02242: variable 'ansible_module_compression' from source: unknown 8238 1726882370.02247: ANSIBALLZ: Using generic lock for ansible.legacy.setup 8238 1726882370.02250: ANSIBALLZ: Acquiring lock 8238 1726882370.02252: ANSIBALLZ: Lock acquired: 140036204254016 8238 1726882370.02254: ANSIBALLZ: Creating module 8238 1726882370.60114: ANSIBALLZ: Writing module into payload 8238 1726882370.60469: ANSIBALLZ: Writing module 8238 1726882370.60719: ANSIBALLZ: 
Renaming module 8238 1726882370.60725: ANSIBALLZ: Done creating module 8238 1726882370.60728: variable 'ansible_facts' from source: unknown 8238 1726882370.60731: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882370.60733: _low_level_execute_command(): starting 8238 1726882370.60735: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 8238 1726882370.63768: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882370.63774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882370.63776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882370.63795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882370.63842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882370.63846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882370.64068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882370.64179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882370.66521: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 8238 1726882370.66527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882370.66722: stderr chunk (state=3): >>><<< 8238 1726882370.66728: stdout chunk (state=3): >>><<< 8238 1726882370.66731: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882370.66737 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 8238 1726882370.66740: _low_level_execute_command(): starting 8238 1726882370.66743: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 8238 1726882370.67558: Sending initial data 8238 1726882370.67562: Sent initial data (1181 bytes) 8238 1726882370.68938: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882370.68942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882370.69236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882370.69453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882370.69669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882370.73498: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 8238 1726882370.73690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882370.73918: stderr chunk (state=3): >>><<< 8238 1726882370.73921: stdout chunk (state=3): >>><<< 8238 1726882370.73926: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882370.74115: variable 'ansible_facts' from source: unknown 8238 1726882370.74118: variable 'ansible_facts' from source: unknown 8238 1726882370.74136: variable 'ansible_module_compression' from source: unknown 8238 1726882370.74179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8238 1726882370.74324: variable 'ansible_facts' from source: unknown 8238 1726882370.74697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py 8238 1726882370.75135: Sending initial data 8238 1726882370.75142: Sent initial data (152 bytes) 8238 1726882370.76777: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882370.76785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882370.76811: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882370.76827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882370.77102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882370.78709: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882370.78812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882370.78991: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpn45rixtm /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py <<< 8238 1726882370.78996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py" <<< 8238 1726882370.79146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpn45rixtm" to remote "/root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py" <<< 8238 1726882370.82959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882370.82963: stdout chunk (state=3): >>><<< 8238 1726882370.82969: stderr chunk (state=3): >>><<< 8238 1726882370.82996: done transferring module to remote 8238 1726882370.83013: _low_level_execute_command(): starting 8238 1726882370.83018: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/ /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py && sleep 0' 8238 1726882370.84422: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882370.84428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882370.84431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882370.84438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882370.84440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
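The low-level commands around this point follow Ansible's usual remote execution sequence: create a per-task temporary directory, sftp the AnsiballZ_setup.py payload across, mark it executable, then run it with the interpreter discovered earlier. Condensed from the commands recorded in this log (paths copied verbatim; this is a summary of what the log shows, not an extra run):

    # 1. Create the remote temp dir (from the mkdir command earlier in this log;
    #    the echo of the resulting path is omitted here).
    /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `" && mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848 `" ) && sleep 0'
    # 2. sftp put of AnsiballZ_setup.py into that directory (see the sftp chunks above).
    # 3. Make the directory and payload executable (the chmod command just above).
    /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/ /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py && sleep 0'
    # 4. Execute the payload with the discovered interpreter (the PYTHONVERBOSE run below).
    /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py && sleep 0'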
8238 1726882370.84443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882370.84796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882370.84824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882370.86719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882370.86803: stderr chunk (state=3): >>><<< 8238 1726882370.86851: stdout chunk (state=3): >>><<< 8238 1726882370.86937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882370.86954: _low_level_execute_command(): starting 8238 1726882370.86965: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/AnsiballZ_setup.py && sleep 0' 8238 1726882370.88424: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882370.88559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882370.88578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882370.88596: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 8238 1726882370.88740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882370.91118: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 8238 1726882370.91140: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 8238 1726882370.91301: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 8238 1726882370.91319: stdout chunk (state=3): >>>import 'time' # <<< 8238 1726882370.91337: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 8238 1726882370.91530: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882370.91535: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d7fc530> <<< 8238 1726882370.91541: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d7cbb30> <<< 8238 1726882370.91561: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 8238 1726882370.91566: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d7feab0> <<< 8238 1726882370.91585: stdout chunk (state=3): >>>import '_signal' # <<< 8238 1726882370.91619: stdout chunk (state=3): >>>import '_abc' # <<< 8238 1726882370.91634: stdout chunk (state=3): >>>import 'abc' # <<< 8238 1726882370.91645: stdout chunk (state=3): >>>import 'io' # <<< 8238 1726882370.91734: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 8238 1726882370.91818: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 8238 1726882370.91872: stdout chunk (state=3): >>>import 'os' # <<< 8238 1726882370.91876: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 8238 1726882370.91878: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 8238 1726882370.91880: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 8238 1726882370.91941: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 8238 1726882370.91963: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d5f11c0> <<< 8238 1726882370.92021: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 8238 1726882370.92132: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d5f2000> <<< 8238 1726882370.92146: stdout chunk (state=3): >>>import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 8238 1726882370.92509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 8238 1726882370.92525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 8238 1726882370.92540: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 8238 1726882370.92557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882370.92611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 8238 1726882370.92624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 8238 1726882370.92700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 8238 1726882370.92717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62fe30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 8238 1726882370.92745: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62fef0> <<< 8238 1726882370.92770: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 8238 1726882370.92830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 8238 1726882370.92875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882370.92930: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d667800> <<< 8238 1726882370.93050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d667e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d647b00> <<< 8238 1726882370.93054: stdout chunk (state=3): >>>import 
'_functools' # <<< 8238 1726882370.93105: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d645220> <<< 8238 1726882370.93217: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62cfe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 8238 1726882370.93327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 8238 1726882370.93372: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d68b7a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d68a3c0> <<< 8238 1726882370.93443: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d647e60> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d688b60> <<< 8238 1726882370.93543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b87d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62c260> <<< 8238 1726882370.93547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 8238 1726882370.93753: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6b8c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b8b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6b8f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62ad80> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 8238 1726882370.93774: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6ba510> <<< 8238 1726882370.94437: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6d4740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6d5e80> <<< 8238 1726882370.94454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6d6d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6d7380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6d6270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882370.94458: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6d7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6d74d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6ba570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d41bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 8238 1726882370.94464: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d4447d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d444530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d444800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d4449e0> <<< 8238 1726882370.94467: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d419e50> <<< 8238 1726882370.94562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 8238 1726882370.94644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8238 1726882370.94650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 8238 1726882370.94662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 8238 1726882370.94666: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d446090> <<< 8238 1726882370.94669: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d444d10> <<< 8238 1726882370.94676: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6bac60> <<< 8238 1726882370.94704: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8238 1726882370.94772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882370.94780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8238 1726882370.94936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 8238 1726882370.95073: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d472420> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d48a5a0> <<< 8238 1726882370.95077: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8238 1726882370.95080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 8238 1726882370.95162: stdout chunk (state=3): >>>import 'ntpath' # <<< 8238 1726882370.95182: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4bf2f0> <<< 8238 1726882370.95185: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8238 1726882370.95284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 8238 1726882370.95301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 8238 1726882370.95640: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4e5a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4bf410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d48b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d2c4470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4895e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d446fc0> <<< 8238 1726882370.96048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b5d2c46e0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_juqv10va/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 8238 1726882370.96092: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882370.96121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 8238 1726882370.96138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8238 1726882370.96177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8238 1726882370.96262: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8238 1726882370.96326: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d32e180> <<< 8238 1726882370.96339: stdout chunk (state=3): >>>import '_typing' # <<< 8238 1726882370.96625: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d305070> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d3041d0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 8238 1726882370.98343: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882370.99587: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d307590> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 8238 1726882370.99590: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 8238 1726882370.99683: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d35db50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35d8e0> <<< 8238 1726882370.99730: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35d1f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 8238 1726882370.99765: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35dc40> <<< 8238 1726882370.99844: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d32ee10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d35e8d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882370.99860: stdout chunk 
(state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d35eb10> <<< 8238 1726882370.99873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 8238 1726882371.00024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35eff0> import 'pwd' # <<< 8238 1726882371.00028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 8238 1726882371.00118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c4d40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d1c6960> <<< 8238 1726882371.00365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c7320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c8500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 8238 1726882371.00373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8238 1726882371.00423: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1caf60> <<< 8238 1726882371.00471: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d1cb080> <<< 8238 1726882371.00485: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c9220> <<< 8238 1726882371.00587: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc 
matches /usr/lib64/python3.12/tokenize.py <<< 8238 1726882371.00962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cef00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cd9d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cd730> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cfa10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c9730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d212fc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d213170> <<< 8238 1726882371.00974: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 8238 1726882371.01008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 8238 1726882371.01026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 8238 1726882371.01244: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d218d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d218b30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8238 1726882371.01272: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d21b260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d2193a0> <<< 8238 1726882371.01381: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # 
code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 8238 1726882371.01397: stdout chunk (state=3): >>>import '_string' # <<< 8238 1726882371.01536: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d222a50> <<< 8238 1726882371.01613: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d21b3e0> <<< 8238 1726882371.01763: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d223770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d223ad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d223da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d213470> <<< 8238 1726882371.01782: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 8238 1726882371.01809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 8238 1726882371.01837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 8238 1726882371.01858: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882371.01951: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d227530> <<< 8238 1726882371.02075: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882371.02083: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d2284a0> <<< 8238 1726882371.02095: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d225ca0> <<< 8238 1726882371.02119: stdout chunk (state=3): >>># extension module 
'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d227020> <<< 8238 1726882371.02268: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d225910> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 8238 1726882371.02272: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.02367: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.02448: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 8238 1726882371.02584: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.02895: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.03338: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.03943: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 8238 1726882371.04005: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882371.04109: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0b0650> <<< 8238 1726882371.04335: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b1460> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d224080> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 8238 1726882371.04541: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.04803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b1be0> # zipimport: zlib available <<< 8238 1726882371.05437: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.05706: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.05942: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib 
available <<< 8238 1726882371.05961: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 8238 1726882371.05973: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.06043: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.06141: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 8238 1726882371.06159: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.06244: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.06269: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 8238 1726882371.06361: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.06542: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.07346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b3ef0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882371.07383: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0ba060> <<< 8238 1726882371.07443: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0ba9c0> <<< 8238 1726882371.07450: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b2de0> <<< 8238 1726882371.07474: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.07515: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.07572: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 8238 1726882371.07575: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.07898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.07918: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882371.07939: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0b9670> <<< 8238 1726882371.07975: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0baba0> <<< 8238 1726882371.08014: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 8238 1726882371.08129: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.08139: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.08171: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.08262: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882371.08274: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 8238 1726882371.08291: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 8238 1726882371.08344: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 8238 1726882371.08640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d14ed50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0c4b90> <<< 8238 1726882371.08738: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0bec30> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0be990> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 8238 1726882371.08755: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.08809: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 8238 1726882371.08895: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 8238 1726882371.08926: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.08930: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 8238 1726882371.08943: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.09106: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.09165: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.09436: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # <<< 8238 1726882371.09447: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.09580: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.09710: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.09762: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.09794: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 8238 1726882371.09810: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.10199: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.10512: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.10558: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.10657: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 8238 1726882371.10672: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 8238 1726882371.10706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 8238 1726882371.10763: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d155970> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 8238 1726882371.10800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 8238 1726882371.10880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 8238 1726882371.10941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 8238 1726882371.11095: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c03b0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c6c06e0> <<< 8238 1726882371.11110: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d135430> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d134860> <<< 8238 1726882371.11161: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d154140> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d157a40> <<< 8238 1726882371.11292: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 8238 1726882371.11509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c6c3680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c2f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c6c3110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c23c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 8238 1726882371.11585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c37d0> <<< 8238 1726882371.11624: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 8238 1726882371.11724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 8238 1726882371.11801: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c72a2d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c7282f0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d157cb0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 8238 1726882371.11832: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.11877: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.11932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 8238 1726882371.11956: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12054: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12067: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 8238 
1726882371.12092: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 8238 1726882371.12149: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.12269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 8238 1726882371.12272: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12296: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 8238 1726882371.12362: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12451: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 8238 1726882371.12463: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12485: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12547: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12608: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.12717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 8238 1726882371.12721: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.13636: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.14227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 8238 1726882371.14256: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.14326: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.14448: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.14468: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.14561: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 8238 1726882371.14607: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 8238 1726882371.14809: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 8238 1726882371.14812: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.14888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 8238 1726882371.14969: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.15001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 8238 1726882371.15041: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.15099: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 8238 1726882371.15166: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.15234: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.15379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 8238 1726882371.15390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 8238 1726882371.15423: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c72b980> <<< 8238 
1726882371.15477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 8238 1726882371.15498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 8238 1726882371.15729: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c72b050> import 'ansible.module_utils.facts.system.local' # <<< 8238 1726882371.15734: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.15848: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.16009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 8238 1726882371.16110: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.16207: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.16368: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 8238 1726882371.16738: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.16761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882371.16819: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c75a510> <<< 8238 1726882371.17069: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c746330> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 8238 1726882371.17174: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # <<< 8238 1726882371.17200: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.17277: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.17420: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.17533: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.17656: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 8238 1726882371.17761: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 8238 1726882371.17815: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.17870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 8238 1726882371.17899: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882371.18067: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 
'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c569fd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c56b830> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 8238 1726882371.18138: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 8238 1726882371.18300: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.18550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.18657: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.18736: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # <<< 8238 1726882371.18781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 8238 1726882371.19027: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.19124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 8238 1726882371.19142: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.19269: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.19410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 8238 1726882371.19449: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.19486: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.20170: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.20954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 8238 1726882371.21063: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.21179: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 8238 1726882371.21197: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.21359: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.21525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 8238 1726882371.21564: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 8238 1726882371.21585: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.21611: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.21669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 8238 1726882371.21924: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.22120: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.22364: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 8238 1726882371.22368: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 8238 1726882371.22392: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.22442: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 8238 1726882371.22494: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 8238 1726882371.22706: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.22710: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.22734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 8238 1726882371.22810: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.23004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # <<< 8238 1726882371.23125: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.23358: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.23601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 8238 1726882371.23621: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.23679: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.23795: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.23827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 8238 1726882371.23848: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24010: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.24014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 8238 1726882371.24016: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24232: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 8238 1726882371.24238: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 8238 1726882371.24240: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24275: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24362: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 8238 1726882371.24365: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.24440: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24443: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24559: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24576: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 
'ansible.module_utils.facts.virtual.dragonfly' # <<< 8238 1726882371.24688: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24725: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.24789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 8238 1726882371.25366: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 8238 1726882371.25370: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 8238 1726882371.25399: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.25456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 8238 1726882371.25481: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.25756: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882371.25862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 8238 1726882371.25950: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882371.26760: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 8238 1726882371.26785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 8238 1726882371.27024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c592ae0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c590cb0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c592540> <<< 8238 1726882372.78064: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5db410> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5d8bf0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 8238 1726882372.78102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 8238 1726882372.78119: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5da360> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5d9df0> <<< 8238 1726882372.78447: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 8238 1726882373.03067: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, 
"micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3115, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 601, "free": 3115}, "nocache": {"free": 3490, "used": 226}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version<<< 8238 1726882373.03109: stdout chunk (state=3): >>>": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 517, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251394297856, "block_size": 4096, "block_total": 64483404, "block_available": 61375561, "block_used": 3107843, "inode_total": 
16384000, "inode_available": 16303147, "inode_used": 80853, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_is_chroot": false, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "52", "epoch": "1726882372", "epoch_int": "1726882372", "date": "2024-09-20", "time": "21:32:52", "iso8601_micro": "2024-09-21T01:32:52.983351Z", "iso8601": "2024-09-21T01:32:52Z", "iso8601_basic": "20240920T213252983351", "iso8601_basic_short": "20240920T213252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_loadavg": {"1m": 0.6240234375, "5m": 0.3310546875, "15m": 0.15185546875}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", 
"ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8238 1726882373.03910: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8238 1726882373.04064: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib <<< 8238 
1726882373.04077: stdout chunk (state=3): >>># cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc <<< 8238 1726882373.04303: stdout chunk (state=3): >>># cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing<<< 8238 1726882373.04373: stdout chunk (state=3): >>> # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing 
_ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # 
cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # 
destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly <<< 8238 1726882373.04464: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network<<< 8238 1726882373.04468: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin <<< 8238 1726882373.04539: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual <<< 8238 1726882373.04542: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl <<< 8238 1726882373.04560: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux<<< 8238 1726882373.04709: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 8238 1726882373.05062: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8238 1726882373.05073: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 8238 1726882373.05125: stdout chunk (state=3): >>># destroy _bz2 <<< 8238 1726882373.05173: stdout chunk (state=3): >>># destroy _compression <<< 8238 1726882373.05244: stdout chunk (state=3): >>># destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 8238 1726882373.05268: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 8238 1726882373.05367: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 8238 1726882373.05396: stdout chunk (state=3): >>># destroy grp <<< 8238 1726882373.05489: stdout chunk (state=3): >>># destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 8238 1726882373.05530: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 8238 1726882373.05664: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 8238 1726882373.05746: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 8238 1726882373.05750: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 8238 1726882373.06008: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 8238 1726882373.06028: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 8238 1726882373.06207: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping 
ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8238 1726882373.06304: stdout chunk (state=3): >>># destroy sys.monitoring <<< 8238 1726882373.06452: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 8238 1726882373.06486: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 8238 1726882373.06489: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 8238 1726882373.06531: stdout chunk (state=3): >>># destroy _typing <<< 8238 1726882373.06607: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8238 1726882373.06871: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy 
encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 8238 1726882373.07012: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 8238 1726882373.07556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882373.07657: stderr chunk (state=3): >>><<< 8238 1726882373.07725: stdout chunk (state=3): >>><<< 8238 1726882373.07936: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d7fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d7cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d7feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d5f11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d5f2000> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
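
Every "# ... matches ...", "# code object from ..." and "import '<name>' # <loader ...>" record in the stdout above and below is CPython's own import tracing rather than Ansible logging: the gathered facts earlier in this run show "PYTHONVERBOSE": "1" in ansible_env on the target, which makes the interpreter narrate each module it loads while the AnsiballZ setup payload executes. A minimal, hypothetical sketch of reproducing the same kind of trace outside Ansible, assuming any stock CPython 3.12 interpreter:

    # Hypothetical reproduction of the import trace seen in this log; the -v flag
    # is equivalent to setting PYTHONVERBOSE=1 in the environment of the target.
    import subprocess
    import sys

    result = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )
    # CPython writes its verbose import records to stderr, one per module,
    # in the same "import '<name>' # <loader>" style that fills this log.
    print(result.stderr)
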
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62fe30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62fef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d667800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d667e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d647b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d645220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62cfe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d68b7a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d68a3c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d647e60> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d688b60> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b87d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62c260> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6b8c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b8b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6b8f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d62ad80> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6b92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6ba510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6d4740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6d5e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2b5d6d6d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6d7380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6d6270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d6d7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6d74d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6ba570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d41bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d4447d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d444530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d444800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d4449e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d419e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d446090> 
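
Two loader kinds alternate through this trace: pure-Python modules are reported as "# code object from '...pyc'" followed by "import '<name>' # <...SourceFileLoader...>", while C extensions such as _bz2, _lzma, math and _sha2 are reported as "# extension module '<name>' loaded from/executed from '...so'" with an ExtensionFileLoader. A small, hypothetical sketch for telling the two apart, using module names taken from the surrounding records (on other builds math may instead be compiled in as a builtin):

    # Hypothetical check of which loader serves a module, mirroring the
    # SourceFileLoader vs. ExtensionFileLoader records in the trace above.
    import importlib.util

    for name in ("weakref", "math", "_sha2"):
        spec = importlib.util.find_spec(name)
        # spec.origin is the .py/.pyc or .so path printed in the log records.
        print(name, type(spec.loader).__name__, spec.origin)
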
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d444d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d6bac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d472420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d48a5a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4bf2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4e5a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4bf410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d48b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d2c4470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d4895e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d446fc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b5d2c46e0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_juqv10va/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d32e180> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d305070> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d3041d0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d307590> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d35db50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35d8e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35d1f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35dc40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d32ee10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d35e8d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d35eb10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d35eff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c4d40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d1c6960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c7320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c8500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1caf60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d1cb080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c9220> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cef00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cd9d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cd730> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1cfa10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d1c9730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d212fc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d213170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d218d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d218b30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d21b260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d2193a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d222a50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d21b3e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d223770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d223ad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d223da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d213470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d227530> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d2284a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d225ca0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d227020> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d225910> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0b0650> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b1460> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d224080> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
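The trace up to this point is CPython's verbose-import output (PYTHONVERBOSE=1 is set in the task environment, as the ansible_env fact later in this log confirms), recorded while the AnsiballZ-packaged setup module loads its ansible.module_utils dependencies from the payload zip noted earlier (ansible_ansible.legacy.setup_payload.zip, 103 names). As a reading aid only, here is a minimal Python sketch, not part of this run, that lists the bundled module_utils entries of such a payload using the standard-library zipfile module; the path is copied from the trace purely as an example, since that temp directory on the managed node is removed once the module exits, so in practice you would point it at a copy of the payload you have saved yourself.

import zipfile

# Example path taken from the trace above; substitute a payload zip you have
# actually kept, because this temp directory is deleted after the task runs.
PAYLOAD = "/tmp/ansible_ansible.legacy.setup_payload_juqv10va/ansible_ansible.legacy.setup_payload.zip"

with zipfile.ZipFile(PAYLOAD) as zf:
    # The "zipimport: found 103 names" message above refers to entries like these.
    for name in sorted(zf.namelist()):
        if name.startswith("ansible/module_utils/"):
            print(name)
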
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b1be0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b3ef0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0ba060> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0ba9c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0b2de0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5d0b9670> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0baba0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d14ed50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0c4b90> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0bec30> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d0be990> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d155970> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c03b0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c6c06e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d135430> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d134860> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d154140> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d157a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c6c3680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c2f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c6c3110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c23c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c6c37d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c72a2d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c7282f0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5d157cb0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c72b980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c72b050> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c75a510> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c746330> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c569fd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c56b830> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b5c592ae0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c590cb0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c592540> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5db410> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5d8bf0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5da360> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b5c5d9df0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3115, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 601, "free": 3115}, "nocache": {"free": 3490, "used": 226}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 517, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251394297856, "block_size": 4096, "block_total": 64483404, "block_available": 61375561, "block_used": 3107843, "inode_total": 16384000, "inode_available": 16303147, "inode_used": 80853, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], 
"ansible_is_chroot": false, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "52", "epoch": "1726882372", "epoch_int": "1726882372", "date": "2024-09-20", "time": "21:32:52", "iso8601_micro": "2024-09-21T01:32:52.983351Z", "iso8601": "2024-09-21T01:32:52Z", "iso8601_basic": "20240920T213252983351", "iso8601_basic_short": "20240920T213252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_loadavg": {"1m": 0.6240234375, "5m": 0.3310546875, "15m": 0.15185546875}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # 
destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy 
zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # 
cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
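Editor's note: the ssh debug lines above show OpenSSH connection multiplexing at work — the run reuses an existing master process through the control socket '/root/.ansible/cp/9aa64530f0' instead of opening a new TCP connection for each task, which is why the exchange ends with "Received exit status from master 0" and "Shared connection to 10.31.45.226 closed." The Python sketch below only illustrates how such a multiplexed command could be issued by hand; the host address and control-socket path are taken from the log, while the ControlPersist value and the sample command are illustrative assumptions, not settings confirmed by this run.

    # Hypothetical illustration (not part of the playbook run): reuse an existing
    # OpenSSH ControlMaster socket the way the "auto-mux" lines above do.
    # HOST and CONTROL_PATH come from the log; everything else is made up.
    import subprocess

    HOST = "10.31.45.226"
    CONTROL_PATH = "/root/.ansible/cp/9aa64530f0"

    def run_over_mux(command: str) -> subprocess.CompletedProcess:
        """Run a command over SSH, reusing the master connection if one exists."""
        return subprocess.run(
            [
                "ssh",
                "-o", "ControlMaster=auto",          # create or reuse a master process
                "-o", f"ControlPath={CONTROL_PATH}",  # socket shared between invocations
                "-o", "ControlPersist=60s",           # assumed value, keeps the master alive
                HOST,
                command,
            ],
            capture_output=True,
            text=True,
        )

    if __name__ == "__main__":
        # "ssh -O check" asks the master process whether it is still running.
        subprocess.run(["ssh", "-O", "check", "-o", f"ControlPath={CONTROL_PATH}", HOST])
        print(run_over_mux("echo ~").stdout)
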
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # 
destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy 
_stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
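Editor's note on the two warnings above: the first fires because the setup module's stdout carries the Python interpreter teardown trace (the long "cleanup"/"destroy" run) after the JSON result; the second records that the interpreter on managed_node3 was discovered at /usr/bin/python3.12 rather than being pinned explicitly. The sketch below only illustrates the first situation — separating a leading JSON document from trailing junk. It is a minimal, assumption-labeled example, not Ansible's own result parser.

    # Minimal sketch of the condition behind "junk after the JSON data":
    # the module prints one JSON document, and interpreter-shutdown noise follows it.
    # This is an illustration only, not how ansible-core actually parses results.
    import json

    def split_module_output(stdout: str) -> tuple[dict, str]:
        """Return (parsed JSON result, trailing junk) from a module's stdout."""
        decoder = json.JSONDecoder()
        result, end = decoder.raw_decode(stdout)  # parse the leading JSON object
        junk = stdout[end:]                       # everything after it is "junk"
        return result, junk

    if __name__ == "__main__":
        # Hypothetical sample output: a small result followed by teardown text.
        sample = '{"ansible_facts": {"ansible_system": "Linux"}, "changed": false}\n# destroy sys ...'
        facts, junk = split_module_output(sample)
        print(facts["ansible_facts"]["ansible_system"])  # -> Linux
        print(repr(junk))                                # -> the trailing teardown text
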
8238 1726882373.11061: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882373.11108: _low_level_execute_command(): starting 8238 1726882373.11118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882369.9606643-8284-278810064524848/ > /dev/null 2>&1 && sleep 0' 8238 1726882373.11845: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882373.11865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882373.11882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882373.11936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882373.12018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882373.12040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882373.12061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882373.12165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882373.15066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882373.15233: stderr chunk (state=3): >>><<< 8238 1726882373.15236: stdout chunk (state=3): >>><<< 8238 1726882373.15273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882373.15428: handler run complete 8238 1726882373.15508: variable 'ansible_facts' from source: unknown 8238 1726882373.15724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882373.16310: variable 'ansible_facts' from source: unknown 8238 1726882373.16446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882373.16607: attempt loop complete, returning result 8238 1726882373.16647: _execute() done 8238 1726882373.16651: dumping result to json 8238 1726882373.16664: done dumping result, returning 8238 1726882373.16685: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-54bc-d334-0000000000cc] 8238 1726882373.16695: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000cc 8238 1726882373.17016: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000cc 8238 1726882373.17929: WORKER PROCESS EXITING ok: [managed_node3] 8238 1726882373.18084: no more pending results, returning what we have 8238 1726882373.18094: results queue empty 8238 1726882373.18119: checking for any_errors_fatal 8238 1726882373.18123: done checking for any_errors_fatal 8238 1726882373.18124: checking for max_fail_percentage 8238 1726882373.18126: done checking for max_fail_percentage 8238 1726882373.18127: checking to see if all hosts have failed and the running result is not ok 8238 1726882373.18128: done checking to see if all hosts have failed 8238 1726882373.18129: getting the remaining hosts for this loop 8238 1726882373.18130: done getting the remaining hosts for this loop 8238 1726882373.18134: getting the next task for host managed_node3 8238 1726882373.18141: done getting next task for host managed_node3 8238 1726882373.18143: ^ task is: TASK: meta (flush_handlers) 8238 1726882373.18145: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882373.18149: getting variables 8238 1726882373.18151: in VariableManager get_vars() 8238 1726882373.18173: Calling all_inventory to load vars for managed_node3 8238 1726882373.18176: Calling groups_inventory to load vars for managed_node3 8238 1726882373.18179: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882373.18189: Calling all_plugins_play to load vars for managed_node3 8238 1726882373.18192: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882373.18195: Calling groups_plugins_play to load vars for managed_node3 8238 1726882373.18467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882373.18688: done with get_vars() 8238 1726882373.18699: done getting variables 8238 1726882373.18785: in VariableManager get_vars() 8238 1726882373.18794: Calling all_inventory to load vars for managed_node3 8238 1726882373.18797: Calling groups_inventory to load vars for managed_node3 8238 1726882373.18800: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882373.18805: Calling all_plugins_play to load vars for managed_node3 8238 1726882373.18807: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882373.18810: Calling groups_plugins_play to load vars for managed_node3 8238 1726882373.19009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882373.19233: done with get_vars() 8238 1726882373.19248: done queuing things up, now waiting for results queue to drain 8238 1726882373.19250: results queue empty 8238 1726882373.19251: checking for any_errors_fatal 8238 1726882373.19253: done checking for any_errors_fatal 8238 1726882373.19254: checking for max_fail_percentage 8238 1726882373.19260: done checking for max_fail_percentage 8238 1726882373.19260: checking to see if all hosts have failed and the running result is not ok 8238 1726882373.19261: done checking to see if all hosts have failed 8238 1726882373.19262: getting the remaining hosts for this loop 8238 1726882373.19263: done getting the remaining hosts for this loop 8238 1726882373.19266: getting the next task for host managed_node3 8238 1726882373.19271: done getting next task for host managed_node3 8238 1726882373.19273: ^ task is: TASK: Include the task 'el_repo_setup.yml' 8238 1726882373.19274: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882373.19277: getting variables 8238 1726882373.19278: in VariableManager get_vars() 8238 1726882373.19292: Calling all_inventory to load vars for managed_node3 8238 1726882373.19294: Calling groups_inventory to load vars for managed_node3 8238 1726882373.19301: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882373.19306: Calling all_plugins_play to load vars for managed_node3 8238 1726882373.19313: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882373.19341: Calling groups_plugins_play to load vars for managed_node3 8238 1726882373.19589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882373.19869: done with get_vars() 8238 1726882373.19880: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Friday 20 September 2024 21:32:53 -0400 (0:00:03.330) 0:00:03.354 ****** 8238 1726882373.19992: entering _queue_task() for managed_node3/include_tasks 8238 1726882373.19994: Creating lock for include_tasks 8238 1726882373.20575: worker is 1 (out of 1 available) 8238 1726882373.20588: exiting _queue_task() for managed_node3/include_tasks 8238 1726882373.20599: done queuing things up, now waiting for results queue to drain 8238 1726882373.20601: waiting for pending results... 8238 1726882373.20837: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 8238 1726882373.21011: in run() - task 0affc7ec-ae25-54bc-d334-000000000006 8238 1726882373.21210: variable 'ansible_search_path' from source: unknown 8238 1726882373.21215: calling self._execute() 8238 1726882373.21260: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882373.21300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882373.21344: variable 'omit' from source: magic vars 8238 1726882373.21544: _execute() done 8238 1726882373.21617: dumping result to json 8238 1726882373.21651: done dumping result, returning 8238 1726882373.21655: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affc7ec-ae25-54bc-d334-000000000006] 8238 1726882373.21661: sending task result for task 0affc7ec-ae25-54bc-d334-000000000006 8238 1726882373.21996: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000006 8238 1726882373.22000: WORKER PROCESS EXITING 8238 1726882373.22051: no more pending results, returning what we have 8238 1726882373.22057: in VariableManager get_vars() 8238 1726882373.22099: Calling all_inventory to load vars for managed_node3 8238 1726882373.22103: Calling groups_inventory to load vars for managed_node3 8238 1726882373.22107: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882373.22126: Calling all_plugins_play to load vars for managed_node3 8238 1726882373.22130: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882373.22134: Calling groups_plugins_play to load vars for managed_node3 8238 1726882373.22702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882373.23043: done with get_vars() 8238 1726882373.23052: variable 'ansible_search_path' from source: unknown 8238 1726882373.23074: we have included files to process 8238 1726882373.23075: generating all_blocks data 8238 1726882373.23077: done 
generating all_blocks data 8238 1726882373.23078: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8238 1726882373.23079: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8238 1726882373.23082: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8238 1726882373.24539: in VariableManager get_vars() 8238 1726882373.24562: done with get_vars() 8238 1726882373.24577: done processing included file 8238 1726882373.24579: iterating over new_blocks loaded from include file 8238 1726882373.24580: in VariableManager get_vars() 8238 1726882373.24591: done with get_vars() 8238 1726882373.24593: filtering new block on tags 8238 1726882373.24616: done filtering new block on tags 8238 1726882373.24620: in VariableManager get_vars() 8238 1726882373.24636: done with get_vars() 8238 1726882373.24638: filtering new block on tags 8238 1726882373.24660: done filtering new block on tags 8238 1726882373.24686: in VariableManager get_vars() 8238 1726882373.24699: done with get_vars() 8238 1726882373.24701: filtering new block on tags 8238 1726882373.24723: done filtering new block on tags 8238 1726882373.24726: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 8238 1726882373.24740: extending task lists for all hosts with included blocks 8238 1726882373.24860: done extending task lists 8238 1726882373.24861: done processing included files 8238 1726882373.24862: results queue empty 8238 1726882373.24863: checking for any_errors_fatal 8238 1726882373.24865: done checking for any_errors_fatal 8238 1726882373.24865: checking for max_fail_percentage 8238 1726882373.24866: done checking for max_fail_percentage 8238 1726882373.24867: checking to see if all hosts have failed and the running result is not ok 8238 1726882373.24868: done checking to see if all hosts have failed 8238 1726882373.24869: getting the remaining hosts for this loop 8238 1726882373.24887: done getting the remaining hosts for this loop 8238 1726882373.24913: getting the next task for host managed_node3 8238 1726882373.24918: done getting next task for host managed_node3 8238 1726882373.24937: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 8238 1726882373.24941: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882373.24943: getting variables 8238 1726882373.24944: in VariableManager get_vars() 8238 1726882373.24964: Calling all_inventory to load vars for managed_node3 8238 1726882373.24966: Calling groups_inventory to load vars for managed_node3 8238 1726882373.24969: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882373.24975: Calling all_plugins_play to load vars for managed_node3 8238 1726882373.24977: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882373.25017: Calling groups_plugins_play to load vars for managed_node3 8238 1726882373.25248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882373.25499: done with get_vars() 8238 1726882373.25508: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:32:53 -0400 (0:00:00.055) 0:00:03.410 ****** 8238 1726882373.25589: entering _queue_task() for managed_node3/setup 8238 1726882373.25998: worker is 1 (out of 1 available) 8238 1726882373.26016: exiting _queue_task() for managed_node3/setup 8238 1726882373.26031: done queuing things up, now waiting for results queue to drain 8238 1726882373.26033: waiting for pending results... 8238 1726882373.26552: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 8238 1726882373.26570: in run() - task 0affc7ec-ae25-54bc-d334-0000000000dd 8238 1726882373.26573: variable 'ansible_search_path' from source: unknown 8238 1726882373.26576: variable 'ansible_search_path' from source: unknown 8238 1726882373.26654: calling self._execute() 8238 1726882373.26934: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882373.26986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882373.26990: variable 'omit' from source: magic vars 8238 1726882373.28024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882373.31190: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882373.31287: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882373.31359: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882373.31409: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882373.31443: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882373.31539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882373.31577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882373.31615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8238 1726882373.31827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882373.31831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882373.31877: variable 'ansible_facts' from source: unknown 8238 1726882373.32007: variable 'network_test_required_facts' from source: task vars 8238 1726882373.32078: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 8238 1726882373.32090: variable 'omit' from source: magic vars 8238 1726882373.32143: variable 'omit' from source: magic vars 8238 1726882373.32198: variable 'omit' from source: magic vars 8238 1726882373.32250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882373.32307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882373.32332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882373.32393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882373.32410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882373.32449: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882373.32459: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882373.32467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882373.32612: Set connection var ansible_connection to ssh 8238 1726882373.32621: Set connection var ansible_shell_type to sh 8238 1726882373.32708: Set connection var ansible_pipelining to False 8238 1726882373.32712: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882373.32715: Set connection var ansible_timeout to 10 8238 1726882373.32721: Set connection var ansible_shell_executable to /bin/sh 8238 1726882373.32728: variable 'ansible_shell_executable' from source: unknown 8238 1726882373.32731: variable 'ansible_connection' from source: unknown 8238 1726882373.32734: variable 'ansible_module_compression' from source: unknown 8238 1726882373.32736: variable 'ansible_shell_type' from source: unknown 8238 1726882373.32738: variable 'ansible_shell_executable' from source: unknown 8238 1726882373.32743: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882373.32774: variable 'ansible_pipelining' from source: unknown 8238 1726882373.32783: variable 'ansible_timeout' from source: unknown 8238 1726882373.32922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882373.33032: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882373.33051: variable 'omit' from source: magic vars 8238 1726882373.33080: starting attempt loop 8238 1726882373.33088: running the handler 8238 1726882373.33108: 
_low_level_execute_command(): starting 8238 1726882373.33120: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882373.34327: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882373.34384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882373.34473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882373.37029: stdout chunk (state=3): >>>/root <<< 8238 1726882373.37139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882373.37143: stdout chunk (state=3): >>><<< 8238 1726882373.37153: stderr chunk (state=3): >>><<< 8238 1726882373.37357: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882373.37369: _low_level_execute_command(): starting 8238 1726882373.37376: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120 `" && echo ansible-tmp-1726882373.3735702-8426-221580983196120="` echo /root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120 `" ) && sleep 0' 8238 1726882373.38158: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882373.38174: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 8238 1726882373.38190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882373.38209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882373.38305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882373.38328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882373.38349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882373.38471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882373.41388: stdout chunk (state=3): >>>ansible-tmp-1726882373.3735702-8426-221580983196120=/root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120 <<< 8238 1726882373.41661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882373.41665: stdout chunk (state=3): >>><<< 8238 1726882373.41668: stderr chunk (state=3): >>><<< 8238 1726882373.41735: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882373.3735702-8426-221580983196120=/root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882373.41774: variable 'ansible_module_compression' from source: unknown 8238 1726882373.41841: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8238 1726882373.42027: variable 'ansible_facts' from source: unknown 8238 1726882373.42164: transferring module to remote 
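Both /bin/sh commands above appear verbatim in the trace: echo ~ && sleep 0 resolves the remote user's home directory, and the umask 77 / mkdir one-liner creates the per-task temporary directory whose name (ansible-tmp-1726882373.3735702-8426-221580983196120) is echoed back so the controller learns the full remote path. A rough local sketch of the same exchange, with subprocess standing in for the ssh connection plugin and an illustrative directory name:

    import subprocess

    # "echo ~ && sleep 0" is how the trace above discovers the remote home dir.
    home = subprocess.run(
        ["/bin/sh", "-c", "echo ~ && sleep 0"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

    # The umask 77 / mkdir one-liner from the trace, with an illustrative
    # directory name in place of the real ansible-tmp-... value. Note that it
    # really does create ~/.ansible/tmp/<name> when run.
    tmp_name = "ansible-tmp-example"
    one_liner = (
        '( umask 77 && mkdir -p "`echo {home}/.ansible/tmp`" '
        '&& mkdir "`echo {home}/.ansible/tmp/{tmp}`" '
        '&& echo {tmp}="`echo {home}/.ansible/tmp/{tmp}`" ) && sleep 0'
    ).format(home=home, tmp=tmp_name)
    result = subprocess.run(["/bin/sh", "-c", one_liner],
                            capture_output=True, text=True)
    print(result.returncode, result.stdout.strip())  # rc=0 and the name=path line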
/root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/AnsiballZ_setup.py 8238 1726882373.42390: Sending initial data 8238 1726882373.42437: Sent initial data (152 bytes) 8238 1726882373.43208: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882373.43269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882373.43350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882373.43395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882373.43517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882373.45927: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882373.46015: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882373.46121: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpl9cx2ixl /root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/AnsiballZ_setup.py <<< 8238 1726882373.46126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/AnsiballZ_setup.py" <<< 8238 1726882373.46219: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpl9cx2ixl" to remote "/root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/AnsiballZ_setup.py" <<< 8238 1726882373.48340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882373.48344: stdout chunk (state=3): >>><<< 8238 1726882373.48346: stderr chunk (state=3): >>><<< 8238 1726882373.48349: done transferring module to remote 8238 1726882373.48351: _low_level_execute_command(): starting 8238 1726882373.48353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/ /root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/AnsiballZ_setup.py && sleep 0' 8238 1726882373.49047: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882373.49115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882373.49190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882373.49234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882373.49248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882373.49362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882373.52255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882373.52260: stdout chunk (state=3): >>><<< 8238 1726882373.52263: stderr chunk (state=3): >>><<< 8238 1726882373.52283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882373.52295: _low_level_execute_command(): starting 8238 1726882373.52306: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/AnsiballZ_setup.py && sleep 0' 8238 1726882373.52975: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882373.52997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882373.53005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882373.53033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882373.53055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882373.53068: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882373.53087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882373.53137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882373.53195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882373.53215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882373.53249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882373.53376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882373.56819: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 8238 1726882373.56870: stdout chunk (state=3): >>>import '_thread' # <<< 8238 1726882373.56878: stdout chunk (state=3): >>>import '_warnings' # <<< 8238 1726882373.56891: stdout chunk (state=3): >>>import '_weakref' # <<< 8238 1726882373.57018: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 8238 1726882373.57078: stdout chunk (state=3): >>> import 'posix' # <<< 8238 1726882373.57134: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 8238 1726882373.57160: stdout chunk (state=3): >>># installing zipimport hook <<< 8238 1726882373.57180: stdout chunk (state=3): >>>import 'time' # <<< 8238 
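At this point the module file is on the target: the trace shows a chmod u+x on the temporary directory and on AnsiballZ_setup.py, followed by the actual run via PYTHONVERBOSE=1 /usr/bin/python3.12 against the uploaded AnsiballZ_setup.py. Setting PYTHONVERBOSE=1 is what switches on the interpreter's import reporting, so every module the setup payload touches shows up as an "import ..." or "# code object from ..." chunk in what follows. A small sketch of that effect under the same environment variable, with "import json" standing in for the real payload:

    import os
    import subprocess
    import sys

    # PYTHONVERBOSE=1 (the env var set in the command above) is equivalent to
    # running "python -v": the interpreter reports every import it performs,
    # which is what produces the long import trace in the chunks that follow.
    env = dict(os.environ, PYTHONVERBOSE="1")
    result = subprocess.run(
        [sys.executable, "-c", "import json"],
        env=env, capture_output=True, text=True,
    )
    print(result.returncode)                            # 0 on success
    print("\n".join(result.stderr.splitlines()[:10]))   # first few trace lines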
1726882373.57207: stdout chunk (state=3): >>> import 'zipimport' # <<< 8238 1726882373.57298: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 8238 1726882373.57304: stdout chunk (state=3): >>> <<< 8238 1726882373.57320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.57362: stdout chunk (state=3): >>>import '_codecs' # <<< 8238 1726882373.57367: stdout chunk (state=3): >>> <<< 8238 1726882373.57404: stdout chunk (state=3): >>>import 'codecs' # <<< 8238 1726882373.57408: stdout chunk (state=3): >>> <<< 8238 1726882373.57461: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 8238 1726882373.57502: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 8238 1726882373.57525: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dbfc530><<< 8238 1726882373.57544: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dbcbb30><<< 8238 1726882373.57551: stdout chunk (state=3): >>> <<< 8238 1726882373.57586: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 8238 1726882373.57618: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dbfeab0><<< 8238 1726882373.57655: stdout chunk (state=3): >>> import '_signal' # <<< 8238 1726882373.57658: stdout chunk (state=3): >>> <<< 8238 1726882373.57700: stdout chunk (state=3): >>>import '_abc' # <<< 8238 1726882373.57714: stdout chunk (state=3): >>> import 'abc' # <<< 8238 1726882373.57752: stdout chunk (state=3): >>> import 'io' # <<< 8238 1726882373.57757: stdout chunk (state=3): >>> <<< 8238 1726882373.57806: stdout chunk (state=3): >>>import '_stat' # <<< 8238 1726882373.57968: stdout chunk (state=3): >>> import 'stat' # import '_collections_abc' # <<< 8238 1726882373.58020: stdout chunk (state=3): >>>import 'genericpath' # <<< 8238 1726882373.58043: stdout chunk (state=3): >>>import 'posixpath' # <<< 8238 1726882373.58105: stdout chunk (state=3): >>>import 'os' # <<< 8238 1726882373.58107: stdout chunk (state=3): >>> <<< 8238 1726882373.58139: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 8238 1726882373.58167: stdout chunk (state=3): >>>Processing user site-packages <<< 8238 1726882373.58189: stdout chunk (state=3): >>>Processing global site-packages <<< 8238 1726882373.58216: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 8238 1726882373.58254: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 8238 1726882373.58283: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 8238 1726882373.58285: stdout chunk (state=3): >>> <<< 8238 1726882373.58341: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 8238 1726882373.58376: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d9f11c0> <<< 8238 1726882373.58483: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 8238 1726882373.58510: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 8238 1726882373.58524: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d9f2000><<< 8238 1726882373.58574: stdout chunk (state=3): >>> import 'site' # <<< 8238 1726882373.58628: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux <<< 8238 1726882373.58641: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information.<<< 8238 1726882373.58804: stdout chunk (state=3): >>> <<< 8238 1726882373.59353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 8238 1726882373.59382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 8238 1726882373.59427: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 8238 1726882373.59453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.59489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 8238 1726882373.59566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 8238 1726882373.59607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 8238 1726882373.59611: stdout chunk (state=3): >>> <<< 8238 1726882373.59656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 8238 1726882373.59689: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2fe90> <<< 8238 1726882373.59728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 8238 1726882373.59764: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 8238 1726882373.59778: stdout chunk (state=3): >>> <<< 8238 1726882373.59811: stdout chunk (state=3): >>>import '_operator' # <<< 8238 1726882373.59817: stdout chunk (state=3): >>> <<< 8238 1726882373.59868: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 8238 1726882373.59871: stdout chunk (state=3): >>> <<< 8238 1726882373.59914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 8238 1726882373.59963: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 8238 1726882373.60052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.60114: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 8238 1726882373.60120: stdout chunk (state=3): >>> <<< 8238 1726882373.60162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da67830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 8238 1726882373.60202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 8238 1726882373.60204: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da67ec0> <<< 8238 1726882373.60308: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da47b00> <<< 8238 1726882373.60341: stdout chunk (state=3): >>>import '_functools' # <<< 8238 1726882373.60402: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da451f0><<< 8238 1726882373.60410: stdout chunk (state=3): >>> <<< 8238 1726882373.60572: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2d040> <<< 8238 1726882373.60801: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da8b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da8a3f0> <<< 8238 1726882373.60847: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 8238 1726882373.60850: stdout chunk (state=3): >>> <<< 8238 1726882373.60873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da462a0> <<< 8238 1726882373.60892: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da88bf0> <<< 8238 1726882373.60990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 8238 1726882373.61033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 8238 1726882373.61038: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab8830> <<< 8238 1726882373.61051: stdout chunk (state=3): >>>import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2c2f0> <<< 8238 1726882373.61080: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 8238 1726882373.61092: stdout chunk (state=3): >>> <<< 8238 1726882373.61111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 8238 1726882373.61142: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.61153: stdout chunk (state=3): >>> <<< 8238 1726882373.61176: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.61183: stdout chunk (state=3): >>> <<< 8238 1726882373.61206: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dab8ce0> <<< 8238 1726882373.61250: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab8b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.61261: stdout chunk (state=3): >>> <<< 8238 1726882373.61287: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.61290: stdout chunk (state=3): >>> import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dab8f80><<< 8238 1726882373.61314: stdout chunk (state=3): >>> <<< 8238 1726882373.61334: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2ae40><<< 8238 1726882373.61360: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 8238 1726882373.61364: stdout chunk (state=3): >>> <<< 8238 1726882373.61448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 8238 1726882373.61650: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab9340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06daba570> import 'importlib.util' # import 'runpy' # <<< 8238 1726882373.61679: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 8238 1726882373.61847: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb06dad4770> import 'errno' # <<< 8238 1726882373.61876: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.61916: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dad5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 8238 1726882373.61946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 8238 1726882373.61987: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 8238 1726882373.61992: stdout chunk (state=3): >>> <<< 8238 1726882373.62035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dad6d50><<< 8238 1726882373.62038: stdout chunk (state=3): >>> <<< 8238 1726882373.62081: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.62124: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dad73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dad62a0><<< 8238 1726882373.62260: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.62265: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dad7e00><<< 8238 1726882373.62288: stdout chunk (state=3): >>> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dad7530> <<< 8238 1726882373.62402: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06daba5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 8238 1726882373.62450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 8238 1726882373.62507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 8238 1726882373.62650: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d80fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from 
'/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.62678: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.62734: stdout chunk (state=3): >>> import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d8387a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d838500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.62737: stdout chunk (state=3): >>> <<< 8238 1726882373.62740: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.62755: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d838740><<< 8238 1726882373.62852: stdout chunk (state=3): >>> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d838950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d80de50><<< 8238 1726882373.62883: stdout chunk (state=3): >>> <<< 8238 1726882373.62886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 8238 1726882373.63047: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8238 1726882373.63092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 8238 1726882373.63136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 8238 1726882373.63158: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d839fd0> <<< 8238 1726882373.63188: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d838c50> <<< 8238 1726882373.63223: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dabac90> <<< 8238 1726882373.63269: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8238 1726882373.63378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.63400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8238 1726882373.63479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 8238 1726882373.63613: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d866390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 8238 1726882373.63635: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.63667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 8238 1726882373.63716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8238 1726882373.63804: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d87e510><<< 8238 1726882373.63849: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8238 1726882373.63916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 8238 1726882373.64020: stdout chunk (state=3): >>> import 'ntpath' # <<< 8238 1726882373.64066: stdout chunk (state=3): >>> <<< 8238 1726882373.64093: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d8b72c0> <<< 8238 1726882373.64143: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 8238 1726882373.64165: stdout chunk (state=3): >>> <<< 8238 1726882373.64206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 8238 1726882373.64261: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 8238 1726882373.64325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 8238 1726882373.64488: stdout chunk (state=3): >>> import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d8d9a60> <<< 8238 1726882373.64615: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d8b73e0> <<< 8238 1726882373.64689: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d87ebd0> <<< 8238 1726882373.64761: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc'<<< 8238 1726882373.64783: stdout chunk (state=3): >>> import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6b8440> <<< 8238 1726882373.64829: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d87d550><<< 8238 1726882373.64850: stdout chunk (state=3): >>> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d83af30> <<< 8238 1726882373.65137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 8238 1726882373.65177: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb06d87d910><<< 8238 1726882373.65230: stdout chunk (state=3): >>> <<< 8238 1726882373.65490: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_setup_payload_22nttasr/ansible_setup_payload.zip' <<< 8238 1726882373.65529: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.65799: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.65821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 8238 1726882373.65872: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8238 1726882373.66003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8238 1726882373.66057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8238 1726882373.66132: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 8238 1726882373.66155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d7261e0> <<< 8238 1726882373.66174: stdout chunk (state=3): >>>import '_typing' # <<< 8238 1726882373.66298: stdout chunk (state=3): >>> <<< 8238 1726882373.66527: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6fd0d0> <<< 8238 1726882373.66531: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6fc230> <<< 8238 1726882373.66570: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.66591: stdout chunk (state=3): >>>import 'ansible' # <<< 8238 1726882373.66628: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.66685: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.66735: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.66739: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 8238 1726882373.66753: stdout chunk (state=3): >>> <<< 8238 1726882373.66806: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.69265: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.71359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 8238 1726882373.71433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6ff5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.71437: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 8238 1726882373.71485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 8238 1726882373.71488: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 8238 1726882373.71628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' 
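The "# zipimport: found 103 names" entry marks the point where the AnsiballZ wrapper makes its bundled ansible_setup_payload.zip importable; from here on, the import 'ansible' and import 'ansible.module_utils...' lines are satisfied from inside that archive, and the recurring "# zipimport: zlib available" notes are the interpreter decompressing members as they are needed. Importing from a zip is a stock Python mechanism; a minimal sketch with a hypothetical archive and package name (demo_payload.zip, payload_pkg):

    import sys
    import zipfile

    # Build a tiny zip-packaged module, then import it the same way the
    # interpreter resolves ansible.module_utils.* from the AnsiballZ payload.
    with zipfile.ZipFile("demo_payload.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("payload_pkg/__init__.py", "")
        zf.writestr("payload_pkg/hello.py", "GREETING = 'hi from the zip'\n")

    sys.path.insert(0, "demo_payload.zip")   # handled by the zipimport hook
    from payload_pkg import hello            # resolved inside the archive
    print(hello.GREETING)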
loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d751c70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d751a00> <<< 8238 1726882373.71641: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d751310> <<< 8238 1726882373.71668: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 8238 1726882373.71688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 8238 1726882373.71774: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d751760> <<< 8238 1726882373.71778: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d726e70> <<< 8238 1726882373.71791: stdout chunk (state=3): >>>import 'atexit' # <<< 8238 1726882373.71851: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.71867: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d752a20> <<< 8238 1726882373.71886: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.72262: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d752c30> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d753080> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5bcec0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.72266: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.72269: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d5beae0> <<< 8238 1726882373.72544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5bf3e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c0590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 8238 1726882373.72573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 8238 1726882373.72616: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 8238 1726882373.72696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8238 1726882373.72734: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c3050> <<< 8238 1726882373.72779: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.72798: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.72802: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d5c3170> <<< 8238 1726882373.72999: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c1310> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 8238 1726882373.73027: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 8238 1726882373.73050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 8238 1726882373.73065: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c6ff0> <<< 8238 1726882373.73089: stdout chunk (state=3): >>>import '_tokenize' # <<< 8238 1726882373.73209: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c5ac0> <<< 8238 1726882373.73227: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c5820> <<< 8238 1726882373.73257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 8238 1726882373.73280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8238 1726882373.73418: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c7bc0> <<< 8238 1726882373.73474: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c1820> <<< 8238 1726882373.73511: stdout chunk (state=3): >>># extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.73538: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.73540: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d60b1a0> <<< 8238 1726882373.73727: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60b320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 8238 1726882373.74127: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d60cef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60cce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8238 1726882373.74130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8238 1726882373.74131: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.74133: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.74134: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d60f3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60d580> <<< 8238 1726882373.74169: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 8238 1726882373.74288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.74329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 8238 1726882373.74347: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 8238 1726882373.74396: stdout chunk (state=3): >>>import '_string' # <<< 8238 1726882373.74458: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61ab70> <<< 8238 1726882373.74740: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60f500><<< 8238 1726882373.74803: stdout chunk (state=3): >>> <<< 8238 1726882373.75033: stdout chunk 
(state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.75037: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61b980><<< 8238 1726882373.75039: stdout chunk (state=3): >>> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.75041: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.75043: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61bc80> <<< 8238 1726882373.75213: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61bd10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60b620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 8238 1726882373.75460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61f500> <<< 8238 1726882373.75696: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.75738: stdout chunk (state=3): >>> <<< 8238 1726882373.75767: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d620800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61dc70><<< 8238 1726882373.75828: stdout chunk (state=3): >>> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.75863: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61f020> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61d8e0><<< 8238 1726882373.75887: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.75918: 
stdout chunk (state=3): >>> # zipimport: zlib available <<< 8238 1726882373.75970: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 8238 1726882373.76135: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.76289: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8238 1726882373.76338: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.76365: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 8238 1726882373.76453: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.76484: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.76487: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 8238 1726882373.76727: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.76950: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.78138: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.78141: stdout chunk (state=3): >>> <<< 8238 1726882373.79171: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 8238 1726882373.79197: stdout chunk (state=3): >>> import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 8238 1726882373.79232: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 8238 1726882373.79238: stdout chunk (state=3): >>> <<< 8238 1726882373.79350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.79363: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4a88c0><<< 8238 1726882373.79366: stdout chunk (state=3): >>> <<< 8238 1726882373.79512: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 8238 1726882373.79524: stdout chunk (state=3): >>> <<< 8238 1726882373.79530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'<<< 8238 1726882373.79567: stdout chunk (state=3): >>> import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4a9700><<< 8238 1726882373.79571: stdout chunk (state=3): >>> <<< 8238 1726882373.79595: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61c1a0><<< 8238 1726882373.79673: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.selinux' # <<< 8238 1726882373.79676: stdout chunk (state=3): >>> <<< 8238 1726882373.79709: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.79716: stdout chunk (state=3): >>> <<< 8238 1726882373.79770: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.79784: stdout chunk (state=3): >>> <<< 8238 1726882373.79813: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 8238 1726882373.79860: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8238 1726882373.80296: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.80454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 8238 1726882373.80460: stdout chunk (state=3): >>> <<< 8238 1726882373.80480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 8238 1726882373.80498: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4a97c0><<< 8238 1726882373.80503: stdout chunk (state=3): >>> <<< 8238 1726882373.80549: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.81467: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.82474: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.82604: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.82609: stdout chunk (state=3): >>> <<< 8238 1726882373.82739: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 8238 1726882373.82743: stdout chunk (state=3): >>> <<< 8238 1726882373.82794: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.82835: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.82891: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 8238 1726882373.82918: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.82924: stdout chunk (state=3): >>> <<< 8238 1726882373.83193: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 8238 1726882373.83241: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.83276: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 8238 1726882373.83306: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.83325: stdout chunk (state=3): >>> <<< 8238 1726882373.83528: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.83532: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 8238 1726882373.83538: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.83918: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.84368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 8238 1726882373.84485: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 8238 1726882373.84520: stdout chunk (state=3): >>>import '_ast' # <<< 8238 1726882373.84669: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4aa5a0> <<< 8238 1726882373.84684: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.84804: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.84808: stdout chunk (state=3): >>> <<< 8238 1726882373.84934: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 8238 1726882373.84937: stdout chunk (state=3): >>> <<< 8238 1726882373.84959: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 8238 1726882373.85291: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.85298: stdout chunk (state=3): >>> <<< 8238 1726882373.85316: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4b2120> <<< 8238 1726882373.85394: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.85407: stdout chunk (state=3): >>> <<< 8238 1726882373.85424: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.85434: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4b2a80><<< 8238 1726882373.85451: stdout chunk (state=3): >>> <<< 8238 1726882373.85459: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4ab440><<< 8238 1726882373.85496: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8238 1726882373.85579: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.85651: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 8238 1726882373.85654: stdout chunk (state=3): >>> <<< 8238 1726882373.85680: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.85766: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.85773: stdout chunk (state=3): >>> <<< 8238 1726882373.85844: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.85948: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.85955: stdout chunk (state=3): >>> <<< 8238 1726882373.86075: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 8238 1726882373.86084: stdout chunk (state=3): >>> <<< 8238 1726882373.86159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 8238 1726882373.86164: stdout chunk (state=3): >>> <<< 8238 1726882373.86295: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.86302: stdout chunk (state=3): >>> <<< 8238 1726882373.86330: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4b17f0><<< 8238 1726882373.86340: stdout chunk (state=3): >>> <<< 8238 1726882373.86473: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4b2bd0> import 'ansible.module_utils.common.file' # <<< 8238 1726882373.86479: stdout chunk (state=3): >>> <<< 8238 1726882373.86575: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 8238 1726882373.86578: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 
1726882373.86643: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.86794: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882373.86863: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 8238 1726882373.86866: stdout chunk (state=3): >>> <<< 8238 1726882373.86910: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 8238 1726882373.86941: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 8238 1726882373.86981: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 8238 1726882373.86997: stdout chunk (state=3): >>> <<< 8238 1726882373.87067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 8238 1726882373.87112: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 8238 1726882373.87147: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 8238 1726882373.87263: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d542c60> <<< 8238 1726882373.87332: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4bc950><<< 8238 1726882373.87336: stdout chunk (state=3): >>> <<< 8238 1726882373.87475: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4b6ae0> <<< 8238 1726882373.87496: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4b6930> <<< 8238 1726882373.87505: stdout chunk (state=3): >>># destroy ansible.module_utils.distro<<< 8238 1726882373.87523: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # <<< 8238 1726882373.87546: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.87607: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.87614: stdout chunk (state=3): >>> <<< 8238 1726882373.87655: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 8238 1726882373.87679: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 8238 1726882373.87801: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 8238 1726882373.87825: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 8238 1726882373.87852: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.87962: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.87969: stdout chunk (state=3): >>> <<< 8238 1726882373.88079: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.88088: stdout chunk (state=3): >>> <<< 8238 1726882373.88120: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.88127: stdout chunk (state=3): >>> <<< 8238 1726882373.88164: stdout chunk (state=3): >>># zipimport: zlib available<<< 
8238 1726882373.88170: stdout chunk (state=3): >>> <<< 8238 1726882373.88248: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.88255: stdout chunk (state=3): >>> <<< 8238 1726882373.88337: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.88343: stdout chunk (state=3): >>> <<< 8238 1726882373.88398: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.88466: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.namespace' # <<< 8238 1726882373.88471: stdout chunk (state=3): >>> <<< 8238 1726882373.88498: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.88646: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.88786: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.88828: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.88834: stdout chunk (state=3): >>> <<< 8238 1726882373.88887: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 8238 1726882373.88894: stdout chunk (state=3): >>> <<< 8238 1726882373.89093: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.89246: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.89253: stdout chunk (state=3): >>> <<< 8238 1726882373.89578: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.89583: stdout chunk (state=3): >>> <<< 8238 1726882373.89664: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.89766: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 8238 1726882373.89769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882373.89828: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 8238 1726882373.89864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 8238 1726882373.89903: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 8238 1726882373.89957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 8238 1726882373.89997: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d549a90> <<< 8238 1726882373.90043: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 8238 1726882373.90101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py<<< 8238 1726882373.90106: stdout chunk (state=3): >>> <<< 8238 1726882373.90180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc'<<< 8238 1726882373.90215: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py<<< 8238 1726882373.90220: stdout chunk (state=3): 
>>> <<< 8238 1726882373.90250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc'<<< 8238 1726882373.90276: stdout chunk (state=3): >>> import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabc290><<< 8238 1726882373.90326: stdout chunk (state=3): >>> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.90331: stdout chunk (state=3): >>> <<< 8238 1726882373.90366: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.90369: stdout chunk (state=3): >>> import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cabc5c0><<< 8238 1726882373.90461: stdout chunk (state=3): >>> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d525310> <<< 8238 1726882373.90501: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d524260> <<< 8238 1726882373.90556: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5481a0><<< 8238 1726882373.90561: stdout chunk (state=3): >>> <<< 8238 1726882373.90585: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d54bb90> <<< 8238 1726882373.90624: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 8238 1726882373.90726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 8238 1726882373.90760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 8238 1726882373.90796: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 8238 1726882373.90824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 8238 1726882373.90860: stdout chunk (state=3): >>> # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.90867: stdout chunk (state=3): >>> # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.90883: stdout chunk (state=3): >>> <<< 8238 1726882373.90894: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cabf590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabee40><<< 8238 1726882373.90937: stdout chunk (state=3): >>> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.90979: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cabf020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabe2a0><<< 8238 1726882373.90983: stdout chunk (state=3): >>> <<< 
8238 1726882373.91170: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 8238 1726882373.91203: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabf740><<< 8238 1726882373.91209: stdout chunk (state=3): >>> <<< 8238 1726882373.91253: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py<<< 8238 1726882373.91258: stdout chunk (state=3): >>> <<< 8238 1726882373.91312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc'<<< 8238 1726882373.91317: stdout chunk (state=3): >>> <<< 8238 1726882373.91360: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.91369: stdout chunk (state=3): >>> <<< 8238 1726882373.91402: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.91404: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cb26270> <<< 8238 1726882373.91469: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb24290> <<< 8238 1726882373.91517: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d54bd10> <<< 8238 1726882373.91543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 8238 1726882373.91568: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.collector' # <<< 8238 1726882373.91577: stdout chunk (state=3): >>> <<< 8238 1726882373.91599: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.91632: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.91642: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other' # <<< 8238 1726882373.91672: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8238 1726882373.91781: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.91882: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other.facter' # <<< 8238 1726882373.91888: stdout chunk (state=3): >>> <<< 8238 1726882373.91919: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.91925: stdout chunk (state=3): >>> <<< 8238 1726882373.92021: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.92114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 8238 1726882373.92118: stdout chunk (state=3): >>> <<< 8238 1726882373.92155: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.92158: stdout chunk (state=3): >>> <<< 8238 1726882373.92166: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.92191: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system' # <<< 8238 1726882373.92220: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.92228: stdout chunk (state=3): >>> <<< 8238 1726882373.92323: stdout 
chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 8238 1726882373.92336: stdout chunk (state=3): >>> <<< 8238 1726882373.92352: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.92440: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.92448: stdout chunk (state=3): >>> <<< 8238 1726882373.92531: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available<<< 8238 1726882373.92605: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.92609: stdout chunk (state=3): >>> <<< 8238 1726882373.92680: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 8238 1726882373.92708: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.92713: stdout chunk (state=3): >>> <<< 8238 1726882373.92814: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.92817: stdout chunk (state=3): >>> <<< 8238 1726882373.92921: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.93026: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.93036: stdout chunk (state=3): >>> <<< 8238 1726882373.93132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 8238 1726882373.93144: stdout chunk (state=3): >>> <<< 8238 1726882373.93152: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 8238 1726882373.93184: stdout chunk (state=3): >>> <<< 8238 1726882373.93189: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.93197: stdout chunk (state=3): >>> <<< 8238 1726882373.94115: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.94123: stdout chunk (state=3): >>> <<< 8238 1726882373.94972: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 8238 1726882373.95195: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882373.95246: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95292: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 8238 1726882373.95317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 8238 1726882373.95347: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95394: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95455: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 8238 1726882373.95464: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95566: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 8238 1726882373.95689: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95736: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 8238 1726882373.95811: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95870: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.95917: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 8238 1726882373.95946: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.96074: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.96248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 8238 1726882373.96386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 8238 1726882373.96389: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb26630> <<< 8238 1726882373.96495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 8238 1726882373.96573: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb27290> <<< 8238 1726882373.96595: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 8238 1726882373.96609: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.96714: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.96821: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 8238 1726882373.96850: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.97009: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.97186: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 8238 1726882373.97207: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.97315: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.97449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 8238 1726882373.97462: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.97533: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.97610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 8238 1726882373.97701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 8238 1726882373.97780: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.97900: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cb5a960> <<< 8238 1726882373.98340: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb43620><<< 8238 1726882373.98343: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.python' # <<< 8238 1726882373.98366: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8238 1726882373.98470: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # <<< 8238 1726882373.98491: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.98670: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882373.98792: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.98806: stdout chunk (state=3): >>> <<< 8238 1726882373.99014: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.99020: stdout chunk (state=3): >>> <<< 
8238 1726882373.99270: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 8238 1726882373.99297: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.service_mgr' # <<< 8238 1726882373.99389: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 8238 1726882373.99393: stdout chunk (state=3): >>> <<< 8238 1726882373.99464: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 8238 1726882373.99474: stdout chunk (state=3): >>> <<< 8238 1726882373.99491: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.99498: stdout chunk (state=3): >>> <<< 8238 1726882373.99570: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882373.99652: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 8238 1726882373.99672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc'<<< 8238 1726882373.99679: stdout chunk (state=3): >>> <<< 8238 1726882373.99730: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 8238 1726882373.99733: stdout chunk (state=3): >>> <<< 8238 1726882373.99776: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882373.99786: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06c8f6120><<< 8238 1726882373.99794: stdout chunk (state=3): >>> <<< 8238 1726882373.99837: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06c8f6180> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available<<< 8238 1726882373.99877: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8238 1726882373.99879: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware' # <<< 8238 1726882373.99919: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8238 1726882373.99993: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.00065: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.base' # <<< 8238 1726882374.00071: stdout chunk (state=3): >>> <<< 8238 1726882374.00097: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.00102: stdout chunk (state=3): >>> <<< 8238 1726882374.00415: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.00423: stdout chunk (state=3): >>> <<< 8238 1726882374.00727: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available<<< 8238 1726882374.00926: stdout chunk (state=3): >>> <<< 8238 1726882374.00931: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.01116: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.01198: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.01202: stdout chunk (state=3): >>> <<< 8238 1726882374.01271: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 8238 1726882374.01278: stdout chunk (state=3): >>> <<< 8238 1726882374.01289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 8238 1726882374.01313: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 
1726882374.01335: stdout chunk (state=3): >>> <<< 8238 1726882374.01367: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.01370: stdout chunk (state=3): >>> <<< 8238 1726882374.01417: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.01431: stdout chunk (state=3): >>> <<< 8238 1726882374.01686: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.01747: stdout chunk (state=3): >>> <<< 8238 1726882374.01955: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 8238 1726882374.02055: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available<<< 8238 1726882374.02198: stdout chunk (state=3): >>> <<< 8238 1726882374.02350: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.02645: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882374.03657: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.04616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 8238 1726882374.04644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 8238 1726882374.04666: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.04864: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.05065: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 8238 1726882374.05070: stdout chunk (state=3): >>> <<< 8238 1726882374.05163: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.05278: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.05489: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.openbsd' # <<< 8238 1726882374.05503: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.05772: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.06093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 8238 1726882374.06170: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 8238 1726882374.06203: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.06236: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.06310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 8238 1726882374.06316: stdout chunk (state=3): >>> <<< 8238 1726882374.06340: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.06353: stdout chunk (state=3): >>> <<< 8238 1726882374.06534: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.06538: stdout chunk (state=3): >>> <<< 8238 1726882374.06714: stdout chunk (state=3): >>># zipimport: zlib available<<< 8238 1726882374.06897: stdout chunk (state=3): >>> <<< 8238 1726882374.07131: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.07514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 8238 1726882374.07550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 8238 1726882374.07555: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.07620: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 
1726882374.07679: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 8238 1726882374.07708: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.07752: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.07787: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 8238 1726882374.10021: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 8238 1726882374.10055: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.10345: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882374.10365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 8238 1726882374.10413: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882374.10493: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.10670: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882374.10777: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 8238 1726882374.10784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 8238 1726882374.10802: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.10889: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.10952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 8238 1726882374.10964: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.11435: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.11651: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 8238 1726882374.11655: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.11720: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.11793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 8238 1726882374.11798: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.11871: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.11943: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 8238 1726882374.12101: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.12202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 8238 1726882374.12211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 8238 1726882374.12219: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.12361: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882374.12568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 8238 1726882374.12852: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 8238 1726882374.12878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 8238 1726882374.12885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 8238 1726882374.13136: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06c91e8a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06c91d2e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06c91bec0> <<< 8238 1726882375.66530: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "55", "epoch": "1726882375", "epoch_int": "1726882375", "date": "2024-09-20", "time": "21:32:55", "iso8601_micro": "2024-09-21T01:32:55.662091Z", "iso8601": "2024-09-21T01:32:55Z", "iso8601_basic": "20240920T213255662091", "iso8601_basic_short": "20240920T213255", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8238 1726882375.67117: stdout chunk (state=3): >>># clear 
sys.path_importer_cache <<< 8238 1726882375.67147: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser <<< 8238 1726882375.67234: stdout chunk (state=3): >>># cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket <<< 8238 1726882375.67275: stdout chunk (state=3): >>># cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter <<< 8238 1726882375.67571: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] 
removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base <<< 8238 1726882375.67575: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 8238 1726882375.67686: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8238 1726882375.67709: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 8238 1726882375.67800: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy 
zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 8238 1726882375.67951: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 8238 1726882375.67955: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 8238 1726882375.67991: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 8238 1726882375.68020: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 8238 1726882375.68054: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 8238 1726882375.68086: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 8238 1726882375.68098: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 8238 1726882375.68267: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser <<< 8238 1726882375.68323: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 8238 1726882375.68363: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # 
cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 8238 1726882375.68436: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8238 1726882375.68582: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 8238 1726882375.68585: stdout chunk (state=3): >>># destroy _collections <<< 8238 1726882375.68599: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 8238 1726882375.68623: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 8238 1726882375.68694: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 8238 1726882375.68698: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 8238 1726882375.68776: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8238 1726882375.68819: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 8238 1726882375.68909: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 8238 1726882375.68913: stdout chunk (state=3): >>># clear sys.audit hooks <<< 8238 1726882375.69527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
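The `import ... #`, `# cleanup[2] removing ...`, and `# destroy ...` lines streamed above are standard CPython verbose-import tracing emitted on the remote interpreter's stderr; the gathered environment later in this log shows `PYTHONVERBOSE=1` set on the managed node, which is what switches that tracing on for the AnsiballZ module run. As a minimal local sketch (illustrative only, not part of this run), the same style of trace can be reproduced by running any script under `python -v` and reading its stderr:

```python
# Illustrative sketch: reproduce the verbose-import/cleanup trace style seen above.
# Assumes a stock CPython 3.x interpreter; `-v` is equivalent to PYTHONVERBOSE=1.
import subprocess
import sys

# Run a trivial script with -v; stderr carries "import json # ..." lines during
# startup and "# cleanup[...] removing ..." / "# destroy ..." lines at shutdown,
# matching the format of the remote module's output captured in this log.
result = subprocess.run(
    [sys.executable, "-v", "-c", "import json"],
    capture_output=True,
    text=True,
)
print(result.stderr)
```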
<<< 8238 1726882375.69530: stderr chunk (state=3): >>><<< 8238 1726882375.69532: stdout chunk (state=3): >>><<< 8238 1726882375.69840: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dbfc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dbcbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dbfeab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d9f11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d9f2000> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da67830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da67ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da47b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da451f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da8b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da8a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da462a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da88bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab8830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dab8ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab8b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dab8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06da2ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dab9340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06daba570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dad4770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dad5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb06dad6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dad73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dad62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06dad7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dad7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06daba5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d80fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d8387a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d838500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d838740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d838950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d80de50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d839fd0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d838c50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06dabac90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d866390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d87e510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d8b72c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d8d9a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d8b73e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d87ebd0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6b8440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d87d550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d83af30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb06d87d910> # zipimport: found 103 names in '/tmp/ansible_setup_payload_22nttasr/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d7261e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6fd0d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6fc230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d6ff5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d751c70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d751a00> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d751310> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d751760> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d726e70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d752a20> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d752c30> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d753080> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5bcec0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d5beae0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5bf3e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c0590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c3050> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d5c3170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c1310> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c6ff0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c5ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c5820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c7bc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5c1820> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d60b1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60b320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d60cef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60cce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d60f3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60d580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61ab70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60f500> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61b980> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61bc80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61bd10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d60b620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61f500> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d620800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61dc70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d61f020> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61d8e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4a88c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4a9700> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d61c1a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4a97c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4aa5a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4b2120> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4b2a80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4ab440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06d4b17f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4b2bd0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d542c60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4bc950> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4b6ae0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d4b6930> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d549a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabc290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cabc5c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d525310> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d524260> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d5481a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d54bb90> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cabf590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabee40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cabf020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabe2a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cabf740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cb26270> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb24290> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06d54bd10> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb26630> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb27290> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06cb5a960> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06cb43620> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06c8f6120> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06c8f6180> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb06c91e8a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06c91d2e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb06c91bec0> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": 
"disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "55", "epoch": "1726882375", "epoch_int": "1726882375", "date": "2024-09-20", "time": "21:32:55", "iso8601_micro": "2024-09-21T01:32:55.662091Z", "iso8601": "2024-09-21T01:32:55Z", "iso8601_basic": "20240920T213255662091", "iso8601_basic_short": "20240920T213255", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # 
cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # 
cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # 
cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping 
collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy 
urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # 
destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # 
cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 8238 1726882375.71257: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882375.71260: _low_level_execute_command(): starting 8238 1726882375.71262: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882373.3735702-8426-221580983196120/ > /dev/null 2>&1 && sleep 0' 8238 1726882375.71290: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882375.71300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882375.71311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882375.71369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882375.71427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882375.71445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882375.71650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882375.71753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882375.73736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882375.73866: stdout chunk (state=3): >>><<< 8238 1726882375.73870: stderr chunk (state=3): >>><<< 8238 1726882375.73873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882375.73875: handler run complete 8238 1726882375.73912: variable 'ansible_facts' from source: unknown 8238 1726882375.74085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882375.74519: variable 'ansible_facts' from source: unknown 8238 1726882375.74522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882375.74665: attempt loop complete, returning result 8238 1726882375.74674: _execute() done 8238 1726882375.74681: dumping result to json 8238 1726882375.74699: done dumping result, returning 8238 1726882375.74711: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affc7ec-ae25-54bc-d334-0000000000dd] 8238 1726882375.74723: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000dd ok: [managed_node3] 8238 1726882375.75219: no more pending results, returning what we have 8238 1726882375.75526: results queue empty 8238 1726882375.75528: checking 
for any_errors_fatal 8238 1726882375.75529: done checking for any_errors_fatal 8238 1726882375.75530: checking for max_fail_percentage 8238 1726882375.75531: done checking for max_fail_percentage 8238 1726882375.75532: checking to see if all hosts have failed and the running result is not ok 8238 1726882375.75533: done checking to see if all hosts have failed 8238 1726882375.75534: getting the remaining hosts for this loop 8238 1726882375.75535: done getting the remaining hosts for this loop 8238 1726882375.75539: getting the next task for host managed_node3 8238 1726882375.75548: done getting next task for host managed_node3 8238 1726882375.75550: ^ task is: TASK: Check if system is ostree 8238 1726882375.75553: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882375.75556: getting variables 8238 1726882375.75557: in VariableManager get_vars() 8238 1726882375.75578: Calling all_inventory to load vars for managed_node3 8238 1726882375.75581: Calling groups_inventory to load vars for managed_node3 8238 1726882375.75584: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882375.75591: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000dd 8238 1726882375.75593: WORKER PROCESS EXITING 8238 1726882375.75603: Calling all_plugins_play to load vars for managed_node3 8238 1726882375.75606: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882375.75609: Calling groups_plugins_play to load vars for managed_node3 8238 1726882375.75883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882375.76087: done with get_vars() 8238 1726882375.76097: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:32:55 -0400 (0:00:02.505) 0:00:05.916 ****** 8238 1726882375.76191: entering _queue_task() for managed_node3/stat 8238 1726882375.76438: worker is 1 (out of 1 available) 8238 1726882375.76452: exiting _queue_task() for managed_node3/stat 8238 1726882375.76463: done queuing things up, now waiting for results queue to drain 8238 1726882375.76465: waiting for pending results... 
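The task queued above, "Check if system is ostree" (task path /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17), runs the stat module on the managed host and, as the log below shows, only proceeds because the conditional `not __network_is_ostree is defined` evaluates to True. The task file itself is not reproduced in this log, so the following is only a minimal sketch of the check it effectively performs, assuming the common convention of testing for an ostree marker file; the /run/ostree-booted path and the helper name are assumptions, while __network_is_ostree matches the variable visible in the conditional.

    # Minimal sketch (not the actual task file) of what "Check if system is ostree"
    # amounts to: stat an assumed marker path and expose the result as a boolean.
    # /run/ostree-booted is an assumption; __network_is_ostree mirrors the
    # conditional seen in this log.
    import os

    def system_is_ostree(marker: str = "/run/ostree-booted") -> bool:
        # ostree-based systems create this marker at boot; plain RPM systems do not.
        return os.path.exists(marker)

    if __name__ == "__main__":
        print({"__network_is_ostree": system_is_ostree()})

The entries that follow show Ansible doing the equivalent remotely: creating a temporary directory, building and transferring AnsiballZ_stat.py over the multiplexed SSH connection, making it executable, and running it with /usr/bin/python3.12 under PYTHONVERBOSE=1.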
8238 1726882375.76707: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 8238 1726882375.76841: in run() - task 0affc7ec-ae25-54bc-d334-0000000000df 8238 1726882375.76862: variable 'ansible_search_path' from source: unknown 8238 1726882375.76870: variable 'ansible_search_path' from source: unknown 8238 1726882375.76910: calling self._execute() 8238 1726882375.76990: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882375.77003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882375.77017: variable 'omit' from source: magic vars 8238 1726882375.77518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882375.77820: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882375.77871: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882375.77915: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882375.77958: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882375.78175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882375.78211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882375.78247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882375.78282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882375.78426: Evaluated conditional (not __network_is_ostree is defined): True 8238 1726882375.78439: variable 'omit' from source: magic vars 8238 1726882375.78483: variable 'omit' from source: magic vars 8238 1726882375.78532: variable 'omit' from source: magic vars 8238 1726882375.78563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882375.78597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882375.78626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882375.78650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882375.78667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882375.78704: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882375.78711: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882375.78717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882375.78828: Set connection var ansible_connection to ssh 8238 1726882375.78840: Set connection var ansible_shell_type to sh 8238 1726882375.78850: Set connection var ansible_pipelining to False 
8238 1726882375.78859: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882375.78869: Set connection var ansible_timeout to 10 8238 1726882375.78927: Set connection var ansible_shell_executable to /bin/sh 8238 1726882375.78931: variable 'ansible_shell_executable' from source: unknown 8238 1726882375.78933: variable 'ansible_connection' from source: unknown 8238 1726882375.78936: variable 'ansible_module_compression' from source: unknown 8238 1726882375.78938: variable 'ansible_shell_type' from source: unknown 8238 1726882375.78940: variable 'ansible_shell_executable' from source: unknown 8238 1726882375.78943: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882375.78945: variable 'ansible_pipelining' from source: unknown 8238 1726882375.78947: variable 'ansible_timeout' from source: unknown 8238 1726882375.78949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882375.79128: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882375.79131: variable 'omit' from source: magic vars 8238 1726882375.79133: starting attempt loop 8238 1726882375.79135: running the handler 8238 1726882375.79137: _low_level_execute_command(): starting 8238 1726882375.79140: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882375.80330: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882375.80339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882375.80342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882375.80345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882375.80455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882375.80533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882375.82257: stdout chunk (state=3): >>>/root <<< 8238 1726882375.82508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882375.82512: stdout chunk (state=3): >>><<< 8238 1726882375.82515: stderr chunk (state=3): >>><<< 8238 1726882375.82605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882375.82976: _low_level_execute_command(): starting 8238 1726882375.82980: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562 `" && echo ansible-tmp-1726882375.8258116-8511-27877307942562="` echo /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562 `" ) && sleep 0' 8238 1726882375.83984: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882375.84147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882375.84151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882375.84154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882375.84156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882375.84159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882375.84234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882375.84249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882375.84391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882375.86465: stdout chunk (state=3): >>>ansible-tmp-1726882375.8258116-8511-27877307942562=/root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562 <<< 8238 1726882375.86551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882375.86842: stderr chunk (state=3): >>><<< 8238 1726882375.86857: stdout chunk (state=3): >>><<< 8238 1726882375.86875: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882375.8258116-8511-27877307942562=/root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882375.87055: variable 'ansible_module_compression' from source: unknown 8238 1726882375.87283: ANSIBALLZ: Using lock for stat 8238 1726882375.87286: ANSIBALLZ: Acquiring lock 8238 1726882375.87289: ANSIBALLZ: Lock acquired: 140036204254736 8238 1726882375.87291: ANSIBALLZ: Creating module 8238 1726882376.05137: ANSIBALLZ: Writing module into payload 8238 1726882376.05239: ANSIBALLZ: Writing module 8238 1726882376.05269: ANSIBALLZ: Renaming module 8238 1726882376.05281: ANSIBALLZ: Done creating module 8238 1726882376.05305: variable 'ansible_facts' from source: unknown 8238 1726882376.05389: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/AnsiballZ_stat.py 8238 1726882376.05634: Sending initial data 8238 1726882376.05644: Sent initial data (150 bytes) 8238 1726882376.06220: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882376.06336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882376.06366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.06485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882376.08207: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882376.08297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882376.08390: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpo1lso8tm /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/AnsiballZ_stat.py <<< 8238 1726882376.08394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/AnsiballZ_stat.py" <<< 8238 1726882376.08474: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpo1lso8tm" to remote "/root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/AnsiballZ_stat.py" <<< 8238 1726882376.09492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882376.09629: stderr chunk (state=3): >>><<< 8238 1726882376.09633: stdout chunk (state=3): >>><<< 8238 1726882376.09667: done transferring module to remote 8238 1726882376.09702: _low_level_execute_command(): starting 8238 1726882376.09856: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/ /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/AnsiballZ_stat.py && sleep 0' 8238 1726882376.11093: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.11258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882376.11297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.11494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 8238 1726882376.13519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882376.13534: stdout chunk (state=3): >>><<< 8238 1726882376.13553: stderr chunk (state=3): >>><<< 8238 1726882376.13578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882376.13581: _low_level_execute_command(): starting 8238 1726882376.13584: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/AnsiballZ_stat.py && sleep 0' 8238 1726882376.14374: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882376.14378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882376.14473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882376.14478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882376.14480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882376.14483: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882376.14485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.14492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882376.14495: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882376.14497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882376.14499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882376.14501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882376.14513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882376.14519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882376.14528: stderr chunk (state=3): >>>debug2: match found <<< 8238 1726882376.14539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.14799: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882376.14803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.14873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882376.17566: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 8238 1726882376.17571: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 8238 1726882376.17606: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb921a4530> <<< 8238 1726882376.17643: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb92173b30> <<< 8238 1726882376.17653: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb921a6ab0> <<< 8238 1726882376.17681: stdout chunk (state=3): >>>import '_signal' # <<< 8238 1726882376.17704: stdout chunk (state=3): >>>import '_abc' # <<< 8238 1726882376.17733: stdout chunk (state=3): >>>import 'abc' # <<< 8238 1726882376.17784: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 8238 1726882376.17912: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 8238 1726882376.17940: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 8238 1726882376.17967: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 8238 1726882376.17989: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 8238 1726882376.18010: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 8238 1726882376.18050: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f551c0> <<< 8238 1726882376.18092: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 8238 1726882376.18102: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' 
import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f55fd0> <<< 8238 1726882376.18141: stdout chunk (state=3): >>>import 'site' # <<< 8238 1726882376.18162: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 8238 1726882376.18430: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 8238 1726882376.18434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 8238 1726882376.18472: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 8238 1726882376.18514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 8238 1726882376.18526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 8238 1726882376.18580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 8238 1726882376.18639: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f93e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 8238 1726882376.18693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f93f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 8238 1726882376.18728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 8238 1726882376.18754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882376.18809: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 8238 1726882376.18849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fcb890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 8238 1726882376.18852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fcbf20> import '_collections' # <<< 8238 1726882376.18912: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fabb30> <<< 8238 1726882376.18929: stdout chunk (state=3): >>>import '_functools' # <<< 8238 1726882376.18941: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fa9250> <<< 8238 
1726882376.19043: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f91010> <<< 8238 1726882376.19076: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 8238 1726882376.19094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 8238 1726882376.19121: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 8238 1726882376.19166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 8238 1726882376.19181: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 8238 1726882376.19201: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fef830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fee450> <<< 8238 1726882376.19251: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91faa0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fecc50> <<< 8238 1726882376.19325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 8238 1726882376.19329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f902c0> <<< 8238 1726882376.19382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 8238 1726882376.19390: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9201cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201cbf0> <<< 8238 1726882376.19426: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9201cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f8ede0> <<< 8238 1726882376.19464: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882376.19479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 8238 1726882376.19531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 8238 1726882376.19563: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201d370> import 'importlib.machinery' # <<< 8238 1726882376.19617: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 8238 1726882376.19656: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201e570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 8238 1726882376.19704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb920387a0> <<< 8238 1726882376.19736: stdout chunk (state=3): >>>import 'errno' # <<< 8238 1726882376.19765: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb92039ee0> <<< 8238 1726882376.19806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 8238 1726882376.19819: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9203ad80> <<< 8238 1726882376.19867: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9203b3b0> <<< 8238 1726882376.19898: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9203a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 8238 1726882376.19944: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882376.19957: stdout chunk (state=3): >>># extension module '_lzma' executed from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9203be00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9203b560> <<< 8238 1726882376.20009: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201e5d0> <<< 8238 1726882376.20055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 8238 1726882376.20058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 8238 1726882376.20078: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 8238 1726882376.20107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 8238 1726882376.20149: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dc3cb0> <<< 8238 1726882376.20179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dec710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91dec470> <<< 8238 1726882376.20214: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dec740> <<< 8238 1726882376.20238: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dec920> <<< 8238 1726882376.20270: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91dc1e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 8238 1726882376.20416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8238 1726882376.20420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91dedf70> <<< 8238 1726882376.20493: stdout chunk (state=3): >>>import 'weakref' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7feb91decbf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201ecc0> <<< 8238 1726882376.20526: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8238 1726882376.20541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8238 1726882376.20602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 8238 1726882376.20645: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e1a300> <<< 8238 1726882376.20671: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 8238 1726882376.20715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882376.20718: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 8238 1726882376.20788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8238 1726882376.20801: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e324b0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8238 1726882376.20846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 8238 1726882376.20894: stdout chunk (state=3): >>>import 'ntpath' # <<< 8238 1726882376.20930: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e6f260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8238 1726882376.20988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 8238 1726882376.21000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 8238 1726882376.21037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 8238 1726882376.21124: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e8da00> <<< 8238 1726882376.21236: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e6f380> <<< 8238 1726882376.21319: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e33140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7feb91cb0320> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e314f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91deeea0> <<< 8238 1726882376.21409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 8238 1726882376.21413: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7feb91cb05f0> <<< 8238 1726882376.21510: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_k_vtp8wm/ansible_stat_payload.zip' # zipimport: zlib available <<< 8238 1726882376.21649: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.21680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8238 1726882376.21714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8238 1726882376.21807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8238 1726882376.21833: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d060f0> import '_typing' # <<< 8238 1726882376.22048: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91cdcfe0> <<< 8238 1726882376.22093: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91cdc170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 8238 1726882376.22116: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8238 1726882376.22145: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 8238 1726882376.23677: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.24977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91cdfe00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 8238 1726882376.25036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 8238 1726882376.25153: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91d31a30> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d317c0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d310d0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 8238 1726882376.25301: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d31b50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d06b10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91d32810> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91d32a50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 8238 1726882376.25387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 8238 1726882376.25419: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d32f90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 8238 1726882376.25498: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b94d40> <<< 8238 1726882376.25541: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91b96960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 8238 1726882376.25591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b97320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 8238 1726882376.25631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b98500> <<< 8238 1726882376.25730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from 
'/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8238 1726882376.26020: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9af90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91b9b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b99250> <<< 8238 1726882376.26081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 8238 1726882376.26084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 8238 1726882376.26252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9ef90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9da90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9d7f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8238 1726882376.26257: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9fe90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b99760> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91be7110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be7260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 8238 1726882376.26284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 8238 1726882376.26326: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91be8e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be8c20> <<< 8238 1726882376.26355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8238 1726882376.26435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8238 1726882376.26520: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91beb3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be9520> <<< 8238 1726882376.26596: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882376.26599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 8238 1726882376.26673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91bf2c00> <<< 8238 1726882376.26786: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91beb590> <<< 8238 1726882376.26835: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf3920> <<< 8238 1726882376.26873: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf3c50> <<< 8238 1726882376.26928: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf3ec0> <<< 8238 1726882376.26956: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be7560> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 8238 1726882376.27000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 8238 1726882376.27087: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf7530> <<< 8238 1726882376.27241: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf88c0> <<< 8238 1726882376.27273: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91bf5cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf7050> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91bf5970> <<< 8238 1726882376.27304: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 8238 1726882376.27399: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.27551: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.27555: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 8238 1726882376.27588: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 8238 1726882376.27687: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.27824: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.28462: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.29057: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 8238 1726882376.29060: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 8238 1726882376.29139: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882376.29237: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91c7ca40> <<< 8238 1726882376.29240: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7d940> <<< 8238 1726882376.29336: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d06060> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882376.29382: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 8238 1726882376.29509: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.29689: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7d640> # zipimport: zlib available <<< 8238 1726882376.30243: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.30728: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.30771: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.30917: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 8238 1726882376.30968: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 8238 1726882376.31068: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.31166: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available <<< 8238 1726882376.31458: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 8238 1726882376.31520: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.31745: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 8238 1726882376.31809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 8238 1726882376.31825: stdout chunk (state=3): >>>import '_ast' # <<< 8238 1726882376.31897: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7e780> <<< 8238 1726882376.31914: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.31973: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.32061: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 8238 1726882376.32086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 8238 1726882376.32170: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882376.32302: stdout chunk (state=3): >>># 
extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91a8e300><<< 8238 1726882376.32312: stdout chunk (state=3): >>> <<< 8238 1726882376.32370: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91a8ec00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7f7d0> <<< 8238 1726882376.32384: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.32421: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.32510: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 8238 1726882376.32513: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.32531: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.32567: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.32627: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.32711: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 8238 1726882376.32754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882376.32828: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 8238 1726882376.32862: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91a8d8b0> <<< 8238 1726882376.32870: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a8edb0> <<< 8238 1726882376.32894: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 8238 1726882376.32974: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.33055: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.33066: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.33105: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 8238 1726882376.33136: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 8238 1726882376.33169: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 8238 1726882376.33181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 8238 
1726882376.33233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 8238 1726882376.33267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 8238 1726882376.33276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 8238 1726882376.33360: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b1ef90> <<< 8238 1726882376.33396: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a9be00> <<< 8238 1726882376.33475: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a92ea0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a92cf0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 8238 1726882376.33532: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 8238 1726882376.33630: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 8238 1726882376.33633: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 8238 1726882376.33684: stdout chunk (state=3): >>> import 'ansible.modules' # # zipimport: zlib available <<< 8238 1726882376.33771: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.34115: stdout chunk (state=3): >>># zipimport: zlib available <<< 8238 1726882376.34133: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 8238 1726882376.34385: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8238 1726882376.34400: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 8238 1726882376.34425: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # 
destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools <<< 8238 1726882376.34655: stdout chunk (state=3): >>># cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # 
cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 8238 1726882376.34779: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8238 1726882376.34833: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy 
importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 8238 1726882376.34839: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 8238 1726882376.34871: stdout chunk (state=3): >>># destroy ntpath <<< 8238 1726882376.34914: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select <<< 8238 1726882376.34957: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array <<< 8238 1726882376.35201: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro <<< 8238 1726882376.35299: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8238 1726882376.35420: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 8238 1726882376.35460: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8238 1726882376.35644: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 8238 1726882376.35650: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 8238 1726882376.36155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882376.36234: stderr chunk (state=3): >>><<< 8238 1726882376.36237: stdout chunk (state=3): >>><<< 8238 1726882376.36304: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb921a4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb92173b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb921a6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f551c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f55fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f93e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f93f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fcb890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fcbf20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fabb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fa9250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f91010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fef830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fee450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91faa0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91fecc50> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f902c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9201cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201cbf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9201cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91f8ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201e570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb920387a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb92039ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7feb9203ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9203b3b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9203a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9203be00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9203b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201e5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dc3cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dec710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91dec470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dec740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91dec920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91dc1e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91dedf70> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91decbf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9201ecc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e1a300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e324b0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e6f260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e8da00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e6f380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e33140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91cb0320> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91e314f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91deeea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7feb91cb05f0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_k_vtp8wm/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d060f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91cdcfe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91cdc170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91cdfe00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91d31a30> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d317c0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d310d0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d31b50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d06b10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91d32810> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91d32a50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d32f90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b94d40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91b96960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b97320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b98500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9af90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91b9b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b99250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9ef90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9da90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9d7f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b9fe90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b99760> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91be7110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be7260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91be8e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be8c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91beb3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be9520> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91bf2c00> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91beb590> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf3920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf3c50> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf3ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91be7560> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf7530> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf88c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91bf5cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91bf7050> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91bf5970> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91c7ca40> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7d940> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91d06060> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7d640> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7e780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91a8e300> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91a8ec00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91c7f7d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb91a8d8b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a8edb0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91b1ef90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a9be00> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a92ea0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb91a92cf0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
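The module output above is the remote execution of the stat module against /run/ostree-booted (see the module_args in the JSON result). A minimal sketch of the task that would produce this call, inferred from those module arguments, the task name reported below ("Check if system is ostree"), and the registered variable __ostree_booted_stat referenced later in this log; the actual tasks/el_repo_setup.yml is not reproduced here, so treat this as an approximation:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat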
[WARNING]: Module invocation had junk after the JSON data: [identical interpreter cleanup trace as shown above; elided] 8238 1726882376.37205: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882376.37209: _low_level_execute_command(): starting 8238 1726882376.37211: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882375.8258116-8511-27877307942562/ > /dev/null 2>&1 && sleep 0' 8238 1726882376.37635: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882376.37640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882376.37642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882376.37645: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882376.37648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.37838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.37960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882376.40032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882376.40036: stdout chunk (state=3): >>><<< 8238 1726882376.40039: stderr chunk (state=3): >>><<< 8238 1726882376.40041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882376.40048: handler run complete 8238 1726882376.40050: attempt loop complete, returning result 8238 1726882376.40053: _execute() done 8238 1726882376.40085: dumping result to json 8238 1726882376.40102: done dumping result, returning 8238 1726882376.40127: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0affc7ec-ae25-54bc-d334-0000000000df] 8238 1726882376.40166: sending task result 
for task 0affc7ec-ae25-54bc-d334-0000000000df ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8238 1726882376.40361: no more pending results, returning what we have 8238 1726882376.40365: results queue empty 8238 1726882376.40367: checking for any_errors_fatal 8238 1726882376.40373: done checking for any_errors_fatal 8238 1726882376.40374: checking for max_fail_percentage 8238 1726882376.40376: done checking for max_fail_percentage 8238 1726882376.40377: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.40377: done checking to see if all hosts have failed 8238 1726882376.40378: getting the remaining hosts for this loop 8238 1726882376.40380: done getting the remaining hosts for this loop 8238 1726882376.40386: getting the next task for host managed_node3 8238 1726882376.40393: done getting next task for host managed_node3 8238 1726882376.40396: ^ task is: TASK: Set flag to indicate system is ostree 8238 1726882376.40401: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882376.40405: getting variables 8238 1726882376.40406: in VariableManager get_vars() 8238 1726882376.40439: Calling all_inventory to load vars for managed_node3 8238 1726882376.40443: Calling groups_inventory to load vars for managed_node3 8238 1726882376.40446: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.40458: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.40461: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.40465: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.41078: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000df 8238 1726882376.41081: WORKER PROCESS EXITING 8238 1726882376.41115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.41427: done with get_vars() 8238 1726882376.41443: done getting variables 8238 1726882376.41585: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:32:56 -0400 (0:00:00.654) 0:00:06.571 ****** 8238 1726882376.41616: entering _queue_task() for managed_node3/set_fact 8238 1726882376.41618: Creating lock for set_fact 8238 1726882376.41921: worker is 1 (out of 1 available) 8238 1726882376.42035: exiting _queue_task() for managed_node3/set_fact 8238 1726882376.42058: done queuing things up, now waiting for results queue to drain 8238 1726882376.42060: waiting for pending results... 
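The task just queued ("Set flag to indicate system is ostree", el_repo_setup.yml:22) turns the registered stat result into the __network_is_ostree fact. A minimal sketch, assuming the task mirrors the conditional (not __network_is_ostree is defined) and the variables reported in the trace that follows; not the verbatim task file:

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined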
8238 1726882376.42282: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 8238 1726882376.42448: in run() - task 0affc7ec-ae25-54bc-d334-0000000000e0 8238 1726882376.42467: variable 'ansible_search_path' from source: unknown 8238 1726882376.42475: variable 'ansible_search_path' from source: unknown 8238 1726882376.42539: calling self._execute() 8238 1726882376.42657: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.42670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.42683: variable 'omit' from source: magic vars 8238 1726882376.43449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882376.43720: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882376.43780: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882376.43825: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882376.43884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882376.44018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882376.44057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882376.44127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882376.44130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882376.44262: Evaluated conditional (not __network_is_ostree is defined): True 8238 1726882376.44282: variable 'omit' from source: magic vars 8238 1726882376.44330: variable 'omit' from source: magic vars 8238 1726882376.44463: variable '__ostree_booted_stat' from source: set_fact 8238 1726882376.44610: variable 'omit' from source: magic vars 8238 1726882376.44614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882376.44617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882376.44620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882376.44645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882376.44661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882376.44701: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882376.44738: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.44742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.44861: Set connection var ansible_connection to ssh 8238 1726882376.44869: Set connection var 
ansible_shell_type to sh 8238 1726882376.44879: Set connection var ansible_pipelining to False 8238 1726882376.44936: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882376.44939: Set connection var ansible_timeout to 10 8238 1726882376.44942: Set connection var ansible_shell_executable to /bin/sh 8238 1726882376.44946: variable 'ansible_shell_executable' from source: unknown 8238 1726882376.44960: variable 'ansible_connection' from source: unknown 8238 1726882376.44967: variable 'ansible_module_compression' from source: unknown 8238 1726882376.44973: variable 'ansible_shell_type' from source: unknown 8238 1726882376.44979: variable 'ansible_shell_executable' from source: unknown 8238 1726882376.44985: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.45028: variable 'ansible_pipelining' from source: unknown 8238 1726882376.45031: variable 'ansible_timeout' from source: unknown 8238 1726882376.45040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.45127: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882376.45153: variable 'omit' from source: magic vars 8238 1726882376.45164: starting attempt loop 8238 1726882376.45182: running the handler 8238 1726882376.45228: handler run complete 8238 1726882376.45231: attempt loop complete, returning result 8238 1726882376.45233: _execute() done 8238 1726882376.45236: dumping result to json 8238 1726882376.45238: done dumping result, returning 8238 1726882376.45240: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affc7ec-ae25-54bc-d334-0000000000e0] 8238 1726882376.45261: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000e0 8238 1726882376.45441: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000e0 8238 1726882376.45444: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 8238 1726882376.45505: no more pending results, returning what we have 8238 1726882376.45508: results queue empty 8238 1726882376.45510: checking for any_errors_fatal 8238 1726882376.45516: done checking for any_errors_fatal 8238 1726882376.45518: checking for max_fail_percentage 8238 1726882376.45520: done checking for max_fail_percentage 8238 1726882376.45520: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.45521: done checking to see if all hosts have failed 8238 1726882376.45625: getting the remaining hosts for this loop 8238 1726882376.45627: done getting the remaining hosts for this loop 8238 1726882376.45630: getting the next task for host managed_node3 8238 1726882376.45639: done getting next task for host managed_node3 8238 1726882376.45641: ^ task is: TASK: Fix CentOS6 Base repo 8238 1726882376.45644: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882376.45650: getting variables 8238 1726882376.45651: in VariableManager get_vars() 8238 1726882376.45683: Calling all_inventory to load vars for managed_node3 8238 1726882376.45686: Calling groups_inventory to load vars for managed_node3 8238 1726882376.45689: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.45697: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.45700: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.45706: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.45909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.46148: done with get_vars() 8238 1726882376.46158: done getting variables 8238 1726882376.46277: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:32:56 -0400 (0:00:00.047) 0:00:06.618 ****** 8238 1726882376.46340: entering _queue_task() for managed_node3/copy 8238 1726882376.46578: worker is 1 (out of 1 available) 8238 1726882376.46590: exiting _queue_task() for managed_node3/copy 8238 1726882376.46601: done queuing things up, now waiting for results queue to drain 8238 1726882376.46603: waiting for pending results... 
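The "Fix CentOS6 Base repo" task queued above (el_repo_setup.yml:26) is a copy task gated on the ansible_distribution fact, and it is skipped below because this host is not CentOS. A rough sketch of that gating, with a hypothetical destination and payload since neither appears in this log:

- name: Fix CentOS6 Base repo
  copy:
    # dest and content are hypothetical placeholders; the real payload is not in this log
    dest: /etc/yum.repos.d/CentOS-Base.repo
    content: "# placeholder repo definition\n"
  when: ansible_distribution == 'CentOS'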
8238 1726882376.47070: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 8238 1726882376.47086: in run() - task 0affc7ec-ae25-54bc-d334-0000000000e2 8238 1726882376.47131: variable 'ansible_search_path' from source: unknown 8238 1726882376.47135: variable 'ansible_search_path' from source: unknown 8238 1726882376.47188: calling self._execute() 8238 1726882376.47298: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.47305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.47374: variable 'omit' from source: magic vars 8238 1726882376.48369: variable 'ansible_distribution' from source: facts 8238 1726882376.48390: Evaluated conditional (ansible_distribution == 'CentOS'): False 8238 1726882376.48394: when evaluation is False, skipping this task 8238 1726882376.48396: _execute() done 8238 1726882376.48399: dumping result to json 8238 1726882376.48528: done dumping result, returning 8238 1726882376.48531: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affc7ec-ae25-54bc-d334-0000000000e2] 8238 1726882376.48534: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000e2 8238 1726882376.48616: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000e2 8238 1726882376.48620: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 8238 1726882376.48705: no more pending results, returning what we have 8238 1726882376.48708: results queue empty 8238 1726882376.48714: checking for any_errors_fatal 8238 1726882376.48723: done checking for any_errors_fatal 8238 1726882376.48724: checking for max_fail_percentage 8238 1726882376.48726: done checking for max_fail_percentage 8238 1726882376.48727: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.48728: done checking to see if all hosts have failed 8238 1726882376.48728: getting the remaining hosts for this loop 8238 1726882376.48730: done getting the remaining hosts for this loop 8238 1726882376.48735: getting the next task for host managed_node3 8238 1726882376.48751: done getting next task for host managed_node3 8238 1726882376.48754: ^ task is: TASK: Include the task 'enable_epel.yml' 8238 1726882376.48758: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882376.48763: getting variables 8238 1726882376.48770: in VariableManager get_vars() 8238 1726882376.48808: Calling all_inventory to load vars for managed_node3 8238 1726882376.48814: Calling groups_inventory to load vars for managed_node3 8238 1726882376.48819: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.49025: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.49028: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.49030: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.49167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.49450: done with get_vars() 8238 1726882376.49459: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:32:56 -0400 (0:00:00.032) 0:00:06.651 ****** 8238 1726882376.49595: entering _queue_task() for managed_node3/include_tasks 8238 1726882376.49902: worker is 1 (out of 1 available) 8238 1726882376.49920: exiting _queue_task() for managed_node3/include_tasks 8238 1726882376.49934: done queuing things up, now waiting for results queue to drain 8238 1726882376.49936: waiting for pending results... 8238 1726882376.50359: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 8238 1726882376.50531: in run() - task 0affc7ec-ae25-54bc-d334-0000000000e3 8238 1726882376.50538: variable 'ansible_search_path' from source: unknown 8238 1726882376.50542: variable 'ansible_search_path' from source: unknown 8238 1726882376.50643: calling self._execute() 8238 1726882376.50715: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.50723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.50735: variable 'omit' from source: magic vars 8238 1726882376.51356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882376.53819: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882376.54128: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882376.54132: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882376.54134: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882376.54137: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882376.54140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882376.54142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882376.54173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882376.54224: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882376.54246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882376.54377: variable '__network_is_ostree' from source: set_fact 8238 1726882376.54402: Evaluated conditional (not __network_is_ostree | d(false)): True 8238 1726882376.54415: _execute() done 8238 1726882376.54425: dumping result to json 8238 1726882376.54434: done dumping result, returning 8238 1726882376.54445: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affc7ec-ae25-54bc-d334-0000000000e3] 8238 1726882376.54456: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000e3 8238 1726882376.54593: no more pending results, returning what we have 8238 1726882376.54598: in VariableManager get_vars() 8238 1726882376.54634: Calling all_inventory to load vars for managed_node3 8238 1726882376.54638: Calling groups_inventory to load vars for managed_node3 8238 1726882376.54642: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.54656: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.54659: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.54662: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.54834: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000e3 8238 1726882376.54838: WORKER PROCESS EXITING 8238 1726882376.54896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.55093: done with get_vars() 8238 1726882376.55102: variable 'ansible_search_path' from source: unknown 8238 1726882376.55103: variable 'ansible_search_path' from source: unknown 8238 1726882376.55146: we have included files to process 8238 1726882376.55148: generating all_blocks data 8238 1726882376.55149: done generating all_blocks data 8238 1726882376.55155: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 8238 1726882376.55157: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 8238 1726882376.55160: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 8238 1726882376.55925: done processing included file 8238 1726882376.55928: iterating over new_blocks loaded from include file 8238 1726882376.55929: in VariableManager get_vars() 8238 1726882376.55941: done with get_vars() 8238 1726882376.55943: filtering new block on tags 8238 1726882376.55968: done filtering new block on tags 8238 1726882376.55971: in VariableManager get_vars() 8238 1726882376.55982: done with get_vars() 8238 1726882376.55984: filtering new block on tags 8238 1726882376.55997: done filtering new block on tags 8238 1726882376.55999: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 8238 1726882376.56005: extending task lists for all hosts with included blocks 8238 1726882376.56118: done extending task lists 8238 1726882376.56120: done 
processing included files 8238 1726882376.56121: results queue empty 8238 1726882376.56123: checking for any_errors_fatal 8238 1726882376.56128: done checking for any_errors_fatal 8238 1726882376.56129: checking for max_fail_percentage 8238 1726882376.56130: done checking for max_fail_percentage 8238 1726882376.56131: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.56132: done checking to see if all hosts have failed 8238 1726882376.56132: getting the remaining hosts for this loop 8238 1726882376.56133: done getting the remaining hosts for this loop 8238 1726882376.56136: getting the next task for host managed_node3 8238 1726882376.56140: done getting next task for host managed_node3 8238 1726882376.56142: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 8238 1726882376.56146: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882376.56148: getting variables 8238 1726882376.56149: in VariableManager get_vars() 8238 1726882376.56157: Calling all_inventory to load vars for managed_node3 8238 1726882376.56159: Calling groups_inventory to load vars for managed_node3 8238 1726882376.56162: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.56167: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.56176: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.56179: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.56348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.56555: done with get_vars() 8238 1726882376.56564: done getting variables 8238 1726882376.56631: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 8238 1726882376.56841: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:32:56 -0400 (0:00:00.072) 0:00:06.723 ****** 8238 1726882376.56889: entering _queue_task() for managed_node3/command 8238 1726882376.56891: Creating lock for command 8238 1726882376.57176: worker is 1 (out of 1 available) 8238 1726882376.57188: exiting _queue_task() for managed_node3/command 8238 1726882376.57201: done queuing things up, now waiting for results queue to drain 8238 
1726882376.57203: waiting for pending results... 8238 1726882376.57458: running TaskExecutor() for managed_node3/TASK: Create EPEL 40 8238 1726882376.57589: in run() - task 0affc7ec-ae25-54bc-d334-0000000000fd 8238 1726882376.57609: variable 'ansible_search_path' from source: unknown 8238 1726882376.57618: variable 'ansible_search_path' from source: unknown 8238 1726882376.57667: calling self._execute() 8238 1726882376.57747: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.57765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.57781: variable 'omit' from source: magic vars 8238 1726882376.58183: variable 'ansible_distribution' from source: facts 8238 1726882376.58231: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8238 1726882376.58235: when evaluation is False, skipping this task 8238 1726882376.58237: _execute() done 8238 1726882376.58239: dumping result to json 8238 1726882376.58241: done dumping result, returning 8238 1726882376.58243: done running TaskExecutor() for managed_node3/TASK: Create EPEL 40 [0affc7ec-ae25-54bc-d334-0000000000fd] 8238 1726882376.58245: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000fd skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8238 1726882376.58567: no more pending results, returning what we have 8238 1726882376.58571: results queue empty 8238 1726882376.58572: checking for any_errors_fatal 8238 1726882376.58573: done checking for any_errors_fatal 8238 1726882376.58574: checking for max_fail_percentage 8238 1726882376.58575: done checking for max_fail_percentage 8238 1726882376.58576: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.58577: done checking to see if all hosts have failed 8238 1726882376.58578: getting the remaining hosts for this loop 8238 1726882376.58579: done getting the remaining hosts for this loop 8238 1726882376.58582: getting the next task for host managed_node3 8238 1726882376.58588: done getting next task for host managed_node3 8238 1726882376.58590: ^ task is: TASK: Install yum-utils package 8238 1726882376.58594: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882376.58597: getting variables 8238 1726882376.58599: in VariableManager get_vars() 8238 1726882376.58627: Calling all_inventory to load vars for managed_node3 8238 1726882376.58630: Calling groups_inventory to load vars for managed_node3 8238 1726882376.58633: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.58643: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.58646: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.58650: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.58867: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000fd 8238 1726882376.58871: WORKER PROCESS EXITING 8238 1726882376.58895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.59117: done with get_vars() 8238 1726882376.59129: done getting variables 8238 1726882376.59227: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:32:56 -0400 (0:00:00.023) 0:00:06.747 ****** 8238 1726882376.59255: entering _queue_task() for managed_node3/package 8238 1726882376.59257: Creating lock for package 8238 1726882376.59495: worker is 1 (out of 1 available) 8238 1726882376.59508: exiting _queue_task() for managed_node3/package 8238 1726882376.59520: done queuing things up, now waiting for results queue to drain 8238 1726882376.59523: waiting for pending results... 
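The "Create EPEL 40" task above is skipped because its guard, ansible_distribution in ['RedHat', 'CentOS'], evaluates to False on this Fedora managed node. A minimal sketch of what a guarded task like the one at enable_epel.yml:8 could look like follows; the task name, the command action plugin, and the when expression come from the log, while the command line itself is an assumption added for illustration.

```yaml
# Hedged reconstruction of the skipped task at enable_epel.yml:8.
# The name, the command action and the `when` guard are reported in the log;
# the command line is an assumed example, not the file's actual contents.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    dnf install -y
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when: ansible_distribution in ['RedHat', 'CentOS']
```

Because the facts gathered later in this run report ansible_distribution as "Fedora", the conditional fails, no module is shipped to the node, and the callback records only the skipping result with the false_condition and skip_reason fields seen above.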
8238 1726882376.59761: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 8238 1726882376.59874: in run() - task 0affc7ec-ae25-54bc-d334-0000000000fe 8238 1726882376.59893: variable 'ansible_search_path' from source: unknown 8238 1726882376.59901: variable 'ansible_search_path' from source: unknown 8238 1726882376.59946: calling self._execute() 8238 1726882376.60028: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.60041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.60058: variable 'omit' from source: magic vars 8238 1726882376.60446: variable 'ansible_distribution' from source: facts 8238 1726882376.60464: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8238 1726882376.60472: when evaluation is False, skipping this task 8238 1726882376.60478: _execute() done 8238 1726882376.60486: dumping result to json 8238 1726882376.60498: done dumping result, returning 8238 1726882376.60512: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affc7ec-ae25-54bc-d334-0000000000fe] 8238 1726882376.60527: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000fe skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8238 1726882376.60679: no more pending results, returning what we have 8238 1726882376.60682: results queue empty 8238 1726882376.60683: checking for any_errors_fatal 8238 1726882376.60689: done checking for any_errors_fatal 8238 1726882376.60690: checking for max_fail_percentage 8238 1726882376.60692: done checking for max_fail_percentage 8238 1726882376.60693: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.60694: done checking to see if all hosts have failed 8238 1726882376.60694: getting the remaining hosts for this loop 8238 1726882376.60696: done getting the remaining hosts for this loop 8238 1726882376.60700: getting the next task for host managed_node3 8238 1726882376.60708: done getting next task for host managed_node3 8238 1726882376.60710: ^ task is: TASK: Enable EPEL 7 8238 1726882376.60715: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882376.60718: getting variables 8238 1726882376.60724: in VariableManager get_vars() 8238 1726882376.60754: Calling all_inventory to load vars for managed_node3 8238 1726882376.60757: Calling groups_inventory to load vars for managed_node3 8238 1726882376.60762: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.60775: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.60778: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.60782: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.61112: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000fe 8238 1726882376.61116: WORKER PROCESS EXITING 8238 1726882376.61142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.61338: done with get_vars() 8238 1726882376.61347: done getting variables 8238 1726882376.61404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:32:56 -0400 (0:00:00.021) 0:00:06.769 ****** 8238 1726882376.61434: entering _queue_task() for managed_node3/command 8238 1726882376.61648: worker is 1 (out of 1 available) 8238 1726882376.61658: exiting _queue_task() for managed_node3/command 8238 1726882376.61669: done queuing things up, now waiting for results queue to drain 8238 1726882376.61671: waiting for pending results... 
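The same guard skips "Install yum-utils package" (enable_epel.yml:26), which the log dispatches through the generic package action plugin. A minimal sketch, assuming the conventional name/state arguments:

```yaml
# Sketch of the skipped task at enable_epel.yml:26. The task name, the
# `package` action and the `when` guard come from the log; the name/state
# arguments are assumptions for illustration.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']
```

The package action resolves to the platform's package manager (dnf on Fedora and recent EL, yum on older EL) at run time, which is why a shared test file can use it without branching on the distribution itself.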
8238 1726882376.61899: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 8238 1726882376.62105: in run() - task 0affc7ec-ae25-54bc-d334-0000000000ff 8238 1726882376.62109: variable 'ansible_search_path' from source: unknown 8238 1726882376.62113: variable 'ansible_search_path' from source: unknown 8238 1726882376.62116: calling self._execute() 8238 1726882376.62159: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.62173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.62187: variable 'omit' from source: magic vars 8238 1726882376.62562: variable 'ansible_distribution' from source: facts 8238 1726882376.62580: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8238 1726882376.62589: when evaluation is False, skipping this task 8238 1726882376.62597: _execute() done 8238 1726882376.62605: dumping result to json 8238 1726882376.62614: done dumping result, returning 8238 1726882376.62627: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affc7ec-ae25-54bc-d334-0000000000ff] 8238 1726882376.62665: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000ff skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8238 1726882376.62870: no more pending results, returning what we have 8238 1726882376.62873: results queue empty 8238 1726882376.62874: checking for any_errors_fatal 8238 1726882376.62882: done checking for any_errors_fatal 8238 1726882376.62883: checking for max_fail_percentage 8238 1726882376.62885: done checking for max_fail_percentage 8238 1726882376.62886: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.62887: done checking to see if all hosts have failed 8238 1726882376.62887: getting the remaining hosts for this loop 8238 1726882376.62889: done getting the remaining hosts for this loop 8238 1726882376.62893: getting the next task for host managed_node3 8238 1726882376.62899: done getting next task for host managed_node3 8238 1726882376.62901: ^ task is: TASK: Enable EPEL 8 8238 1726882376.62905: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882376.62909: getting variables 8238 1726882376.62911: in VariableManager get_vars() 8238 1726882376.62941: Calling all_inventory to load vars for managed_node3 8238 1726882376.62944: Calling groups_inventory to load vars for managed_node3 8238 1726882376.62948: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.62959: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.62962: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.62966: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.63244: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000ff 8238 1726882376.63247: WORKER PROCESS EXITING 8238 1726882376.63271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.63470: done with get_vars() 8238 1726882376.63479: done getting variables 8238 1726882376.63539: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:32:56 -0400 (0:00:00.021) 0:00:06.790 ****** 8238 1726882376.63567: entering _queue_task() for managed_node3/command 8238 1726882376.63777: worker is 1 (out of 1 available) 8238 1726882376.63788: exiting _queue_task() for managed_node3/command 8238 1726882376.63799: done queuing things up, now waiting for results queue to drain 8238 1726882376.63801: waiting for pending results... 
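The "Enable EPEL 7" task above, and the "Enable EPEL 8" and "Enable EPEL 6" variants that follow, skip for the same reason. To see the fact values those guards compare against, a debug task (or an ad-hoc setup call) can print them; this snippet is illustrative only and is not part of the logged playbook:

```yaml
# Illustrative only: print the facts referenced by the EPEL `when` guards.
- name: Show the facts used by the EPEL guards
  debug:
    msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"
```

On this node it would print "Fedora 40", matching both the rendered task name "Create EPEL 40" earlier in the log and the facts gathered at the end of this excerpt.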
8238 1726882376.64030: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 8238 1726882376.64142: in run() - task 0affc7ec-ae25-54bc-d334-000000000100 8238 1726882376.64163: variable 'ansible_search_path' from source: unknown 8238 1726882376.64172: variable 'ansible_search_path' from source: unknown 8238 1726882376.64211: calling self._execute() 8238 1726882376.64291: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.64304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.64318: variable 'omit' from source: magic vars 8238 1726882376.64689: variable 'ansible_distribution' from source: facts 8238 1726882376.64709: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8238 1726882376.64717: when evaluation is False, skipping this task 8238 1726882376.64805: _execute() done 8238 1726882376.64808: dumping result to json 8238 1726882376.64811: done dumping result, returning 8238 1726882376.64814: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affc7ec-ae25-54bc-d334-000000000100] 8238 1726882376.64816: sending task result for task 0affc7ec-ae25-54bc-d334-000000000100 8238 1726882376.64880: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000100 8238 1726882376.64884: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8238 1726882376.64955: no more pending results, returning what we have 8238 1726882376.64959: results queue empty 8238 1726882376.64960: checking for any_errors_fatal 8238 1726882376.64966: done checking for any_errors_fatal 8238 1726882376.64967: checking for max_fail_percentage 8238 1726882376.64968: done checking for max_fail_percentage 8238 1726882376.64969: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.64970: done checking to see if all hosts have failed 8238 1726882376.64971: getting the remaining hosts for this loop 8238 1726882376.64972: done getting the remaining hosts for this loop 8238 1726882376.64976: getting the next task for host managed_node3 8238 1726882376.64986: done getting next task for host managed_node3 8238 1726882376.64988: ^ task is: TASK: Enable EPEL 6 8238 1726882376.64993: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882376.64997: getting variables 8238 1726882376.64999: in VariableManager get_vars() 8238 1726882376.65030: Calling all_inventory to load vars for managed_node3 8238 1726882376.65033: Calling groups_inventory to load vars for managed_node3 8238 1726882376.65037: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.65050: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.65053: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.65057: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.65351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.65546: done with get_vars() 8238 1726882376.65555: done getting variables 8238 1726882376.65613: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:32:56 -0400 (0:00:00.020) 0:00:06.811 ****** 8238 1726882376.65642: entering _queue_task() for managed_node3/copy 8238 1726882376.65849: worker is 1 (out of 1 available) 8238 1726882376.65860: exiting _queue_task() for managed_node3/copy 8238 1726882376.65869: done queuing things up, now waiting for results queue to drain 8238 1726882376.65871: waiting for pending results... 8238 1726882376.66092: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 8238 1726882376.66298: in run() - task 0affc7ec-ae25-54bc-d334-000000000102 8238 1726882376.66302: variable 'ansible_search_path' from source: unknown 8238 1726882376.66305: variable 'ansible_search_path' from source: unknown 8238 1726882376.66307: calling self._execute() 8238 1726882376.66341: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.66353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.66366: variable 'omit' from source: magic vars 8238 1726882376.66734: variable 'ansible_distribution' from source: facts 8238 1726882376.66755: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8238 1726882376.66763: when evaluation is False, skipping this task 8238 1726882376.66770: _execute() done 8238 1726882376.66777: dumping result to json 8238 1726882376.66785: done dumping result, returning 8238 1726882376.66794: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affc7ec-ae25-54bc-d334-000000000102] 8238 1726882376.66806: sending task result for task 0affc7ec-ae25-54bc-d334-000000000102 8238 1726882376.66923: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000102 8238 1726882376.66927: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8238 1726882376.67003: no more pending results, returning what we have 8238 1726882376.67007: results queue empty 8238 1726882376.67008: checking for any_errors_fatal 8238 1726882376.67012: done checking for any_errors_fatal 8238 1726882376.67013: checking for max_fail_percentage 8238 1726882376.67015: done 
checking for max_fail_percentage 8238 1726882376.67016: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.67017: done checking to see if all hosts have failed 8238 1726882376.67017: getting the remaining hosts for this loop 8238 1726882376.67019: done getting the remaining hosts for this loop 8238 1726882376.67025: getting the next task for host managed_node3 8238 1726882376.67035: done getting next task for host managed_node3 8238 1726882376.67038: ^ task is: TASK: Set network provider to 'nm' 8238 1726882376.67040: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882376.67044: getting variables 8238 1726882376.67046: in VariableManager get_vars() 8238 1726882376.67076: Calling all_inventory to load vars for managed_node3 8238 1726882376.67079: Calling groups_inventory to load vars for managed_node3 8238 1726882376.67083: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.67096: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.67099: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.67103: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.67409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.67668: done with get_vars() 8238 1726882376.67677: done getting variables 8238 1726882376.67737: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13 Friday 20 September 2024 21:32:56 -0400 (0:00:00.021) 0:00:06.832 ****** 8238 1726882376.67772: entering _queue_task() for managed_node3/set_fact 8238 1726882376.68061: worker is 1 (out of 1 available) 8238 1726882376.68074: exiting _queue_task() for managed_node3/set_fact 8238 1726882376.68090: done queuing things up, now waiting for results queue to drain 8238 1726882376.68092: waiting for pending results... 
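The next task, "Set network provider to 'nm'" (tests_bond_nm.yml:13), uses the set_fact action; per the ok result that follows, it records network_provider: "nm". A minimal sketch consistent with that result:

```yaml
# Sketch of the set_fact task at tests_bond_nm.yml:13; the fact name and value
# are taken from the "ok" result shown in the log.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```

set_fact executes entirely on the controller, so no module is transferred to the managed node; this matches the log, where the handler completes immediately after the connection variables are resolved and before any _low_level_execute_command() calls appear.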
8238 1726882376.68341: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 8238 1726882376.68346: in run() - task 0affc7ec-ae25-54bc-d334-000000000007 8238 1726882376.68350: variable 'ansible_search_path' from source: unknown 8238 1726882376.68352: calling self._execute() 8238 1726882376.68398: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.68407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.68417: variable 'omit' from source: magic vars 8238 1726882376.68512: variable 'omit' from source: magic vars 8238 1726882376.68626: variable 'omit' from source: magic vars 8238 1726882376.68629: variable 'omit' from source: magic vars 8238 1726882376.68630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882376.68638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882376.68657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882376.68673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882376.68682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882376.68706: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882376.68714: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.68717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.68793: Set connection var ansible_connection to ssh 8238 1726882376.68796: Set connection var ansible_shell_type to sh 8238 1726882376.68799: Set connection var ansible_pipelining to False 8238 1726882376.68807: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882376.68812: Set connection var ansible_timeout to 10 8238 1726882376.68822: Set connection var ansible_shell_executable to /bin/sh 8238 1726882376.68839: variable 'ansible_shell_executable' from source: unknown 8238 1726882376.68842: variable 'ansible_connection' from source: unknown 8238 1726882376.68844: variable 'ansible_module_compression' from source: unknown 8238 1726882376.68849: variable 'ansible_shell_type' from source: unknown 8238 1726882376.68851: variable 'ansible_shell_executable' from source: unknown 8238 1726882376.68854: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.68856: variable 'ansible_pipelining' from source: unknown 8238 1726882376.68860: variable 'ansible_timeout' from source: unknown 8238 1726882376.68863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.68973: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882376.69144: variable 'omit' from source: magic vars 8238 1726882376.69147: starting attempt loop 8238 1726882376.69150: running the handler 8238 1726882376.69151: handler run complete 8238 1726882376.69153: attempt loop complete, returning result 8238 1726882376.69154: _execute() done 8238 1726882376.69155: dumping result to json 8238 
1726882376.69157: done dumping result, returning 8238 1726882376.69158: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affc7ec-ae25-54bc-d334-000000000007] 8238 1726882376.69159: sending task result for task 0affc7ec-ae25-54bc-d334-000000000007 8238 1726882376.69202: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000007 8238 1726882376.69205: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 8238 1726882376.69239: no more pending results, returning what we have 8238 1726882376.69241: results queue empty 8238 1726882376.69241: checking for any_errors_fatal 8238 1726882376.69244: done checking for any_errors_fatal 8238 1726882376.69245: checking for max_fail_percentage 8238 1726882376.69245: done checking for max_fail_percentage 8238 1726882376.69248: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.69249: done checking to see if all hosts have failed 8238 1726882376.69250: getting the remaining hosts for this loop 8238 1726882376.69252: done getting the remaining hosts for this loop 8238 1726882376.69254: getting the next task for host managed_node3 8238 1726882376.69259: done getting next task for host managed_node3 8238 1726882376.69260: ^ task is: TASK: meta (flush_handlers) 8238 1726882376.69261: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882376.69264: getting variables 8238 1726882376.69264: in VariableManager get_vars() 8238 1726882376.69282: Calling all_inventory to load vars for managed_node3 8238 1726882376.69283: Calling groups_inventory to load vars for managed_node3 8238 1726882376.69285: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.69292: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.69294: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.69295: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.69405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.69523: done with get_vars() 8238 1726882376.69530: done getting variables 8238 1726882376.69578: in VariableManager get_vars() 8238 1726882376.69585: Calling all_inventory to load vars for managed_node3 8238 1726882376.69587: Calling groups_inventory to load vars for managed_node3 8238 1726882376.69588: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.69591: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.69593: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.69594: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.69819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.69932: done with get_vars() 8238 1726882376.69941: done queuing things up, now waiting for results queue to drain 8238 1726882376.69943: results queue empty 8238 1726882376.69943: checking for any_errors_fatal 8238 1726882376.69944: done checking for any_errors_fatal 8238 1726882376.69945: checking for max_fail_percentage 8238 1726882376.69946: done checking for max_fail_percentage 8238 
1726882376.69948: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.69949: done checking to see if all hosts have failed 8238 1726882376.69949: getting the remaining hosts for this loop 8238 1726882376.69950: done getting the remaining hosts for this loop 8238 1726882376.69951: getting the next task for host managed_node3 8238 1726882376.69954: done getting next task for host managed_node3 8238 1726882376.69954: ^ task is: TASK: meta (flush_handlers) 8238 1726882376.69955: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882376.69961: getting variables 8238 1726882376.69962: in VariableManager get_vars() 8238 1726882376.69967: Calling all_inventory to load vars for managed_node3 8238 1726882376.69968: Calling groups_inventory to load vars for managed_node3 8238 1726882376.69971: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.69974: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.69976: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.69978: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.70060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.70170: done with get_vars() 8238 1726882376.70188: done getting variables 8238 1726882376.70228: in VariableManager get_vars() 8238 1726882376.70234: Calling all_inventory to load vars for managed_node3 8238 1726882376.70235: Calling groups_inventory to load vars for managed_node3 8238 1726882376.70237: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.70240: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.70242: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.70243: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.70408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.70616: done with get_vars() 8238 1726882376.70631: done queuing things up, now waiting for results queue to drain 8238 1726882376.70633: results queue empty 8238 1726882376.70633: checking for any_errors_fatal 8238 1726882376.70634: done checking for any_errors_fatal 8238 1726882376.70635: checking for max_fail_percentage 8238 1726882376.70636: done checking for max_fail_percentage 8238 1726882376.70637: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.70637: done checking to see if all hosts have failed 8238 1726882376.70638: getting the remaining hosts for this loop 8238 1726882376.70639: done getting the remaining hosts for this loop 8238 1726882376.70641: getting the next task for host managed_node3 8238 1726882376.70644: done getting next task for host managed_node3 8238 1726882376.70645: ^ task is: None 8238 1726882376.70646: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882376.70647: done queuing things up, now waiting for results queue to drain 8238 1726882376.70648: results queue empty 8238 1726882376.70649: checking for any_errors_fatal 8238 1726882376.70654: done checking for any_errors_fatal 8238 1726882376.70655: checking for max_fail_percentage 8238 1726882376.70656: done checking for max_fail_percentage 8238 1726882376.70657: checking to see if all hosts have failed and the running result is not ok 8238 1726882376.70658: done checking to see if all hosts have failed 8238 1726882376.70660: getting the next task for host managed_node3 8238 1726882376.70667: done getting next task for host managed_node3 8238 1726882376.70668: ^ task is: None 8238 1726882376.70669: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882376.70710: in VariableManager get_vars() 8238 1726882376.70738: done with get_vars() 8238 1726882376.70745: in VariableManager get_vars() 8238 1726882376.70765: done with get_vars() 8238 1726882376.70774: variable 'omit' from source: magic vars 8238 1726882376.70806: in VariableManager get_vars() 8238 1726882376.70824: done with get_vars() 8238 1726882376.70845: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 8238 1726882376.71530: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 8238 1726882376.71555: getting the remaining hosts for this loop 8238 1726882376.71556: done getting the remaining hosts for this loop 8238 1726882376.71558: getting the next task for host managed_node3 8238 1726882376.71560: done getting next task for host managed_node3 8238 1726882376.71561: ^ task is: TASK: Gathering Facts 8238 1726882376.71562: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882376.71563: getting variables 8238 1726882376.71564: in VariableManager get_vars() 8238 1726882376.71572: Calling all_inventory to load vars for managed_node3 8238 1726882376.71573: Calling groups_inventory to load vars for managed_node3 8238 1726882376.71575: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882376.71578: Calling all_plugins_play to load vars for managed_node3 8238 1726882376.71588: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882376.71590: Calling groups_plugins_play to load vars for managed_node3 8238 1726882376.71676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882376.71786: done with get_vars() 8238 1726882376.71791: done getting variables 8238 1726882376.71819: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Friday 20 September 2024 21:32:56 -0400 (0:00:00.040) 0:00:06.873 ****** 8238 1726882376.71836: entering _queue_task() for managed_node3/gather_facts 8238 1726882376.71999: worker is 1 (out of 1 available) 8238 1726882376.72011: exiting _queue_task() for managed_node3/gather_facts 8238 1726882376.72025: done queuing things up, now waiting for results queue to drain 8238 1726882376.72027: waiting for pending results... 8238 1726882376.72177: running TaskExecutor() for managed_node3/TASK: Gathering Facts 8238 1726882376.72235: in run() - task 0affc7ec-ae25-54bc-d334-000000000128 8238 1726882376.72245: variable 'ansible_search_path' from source: unknown 8238 1726882376.72281: calling self._execute() 8238 1726882376.72342: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.72346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.72360: variable 'omit' from source: magic vars 8238 1726882376.72633: variable 'ansible_distribution_major_version' from source: facts 8238 1726882376.72643: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882376.72648: variable 'omit' from source: magic vars 8238 1726882376.72671: variable 'omit' from source: magic vars 8238 1726882376.72696: variable 'omit' from source: magic vars 8238 1726882376.72735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882376.72763: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882376.72778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882376.72793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882376.72805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882376.72833: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882376.72837: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 
1726882376.72840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.72914: Set connection var ansible_connection to ssh 8238 1726882376.72919: Set connection var ansible_shell_type to sh 8238 1726882376.72922: Set connection var ansible_pipelining to False 8238 1726882376.72924: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882376.72937: Set connection var ansible_timeout to 10 8238 1726882376.72944: Set connection var ansible_shell_executable to /bin/sh 8238 1726882376.72961: variable 'ansible_shell_executable' from source: unknown 8238 1726882376.72964: variable 'ansible_connection' from source: unknown 8238 1726882376.72967: variable 'ansible_module_compression' from source: unknown 8238 1726882376.72970: variable 'ansible_shell_type' from source: unknown 8238 1726882376.72973: variable 'ansible_shell_executable' from source: unknown 8238 1726882376.72975: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882376.72977: variable 'ansible_pipelining' from source: unknown 8238 1726882376.72980: variable 'ansible_timeout' from source: unknown 8238 1726882376.72985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882376.73125: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882376.73135: variable 'omit' from source: magic vars 8238 1726882376.73138: starting attempt loop 8238 1726882376.73142: running the handler 8238 1726882376.73162: variable 'ansible_facts' from source: unknown 8238 1726882376.73176: _low_level_execute_command(): starting 8238 1726882376.73184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882376.73845: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.73914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882376.73917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.74033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882376.75794: stdout chunk (state=3): >>>/root <<< 8238 1726882376.75999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882376.76004: stdout chunk (state=3): >>><<< 8238 1726882376.76007: stderr 
chunk (state=3): >>><<< 8238 1726882376.76117: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882376.76121: _low_level_execute_command(): starting 8238 1726882376.76124: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865 `" && echo ansible-tmp-1726882376.760305-8556-37451350208865="` echo /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865 `" ) && sleep 0' 8238 1726882376.76708: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882376.76725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882376.76739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882376.76761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882376.76890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882376.76895: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882376.76948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.77041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882376.79032: stdout chunk (state=3): >>>ansible-tmp-1726882376.760305-8556-37451350208865=/root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865 <<< 8238 1726882376.79216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 
1726882376.79219: stdout chunk (state=3): >>><<< 8238 1726882376.79224: stderr chunk (state=3): >>><<< 8238 1726882376.79432: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882376.760305-8556-37451350208865=/root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882376.79435: variable 'ansible_module_compression' from source: unknown 8238 1726882376.79438: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8238 1726882376.79440: variable 'ansible_facts' from source: unknown 8238 1726882376.79626: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/AnsiballZ_setup.py 8238 1726882376.79794: Sending initial data 8238 1726882376.79835: Sent initial data (150 bytes) 8238 1726882376.80511: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882376.80554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882376.80568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882376.80644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.80688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882376.80706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882376.80738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.80874: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 8238 1726882376.82449: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8238 1726882376.82473: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882376.82553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882376.82654: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpthgvicgs /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/AnsiballZ_setup.py <<< 8238 1726882376.82657: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/AnsiballZ_setup.py" <<< 8238 1726882376.82744: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpthgvicgs" to remote "/root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/AnsiballZ_setup.py" <<< 8238 1726882376.85039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882376.85128: stderr chunk (state=3): >>><<< 8238 1726882376.85131: stdout chunk (state=3): >>><<< 8238 1726882376.85134: done transferring module to remote 8238 1726882376.85136: _low_level_execute_command(): starting 8238 1726882376.85138: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/ /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/AnsiballZ_setup.py && sleep 0' 8238 1726882376.86056: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882376.86060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882376.86063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.86075: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.86150: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882376.86153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882376.86253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882376.88403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882376.88406: stderr chunk (state=3): >>><<< 8238 1726882376.88413: stdout chunk (state=3): >>><<< 8238 1726882376.88433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882376.88443: _low_level_execute_command(): starting 8238 1726882376.88457: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/AnsiballZ_setup.py && sleep 0' 8238 1726882376.89482: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882376.89508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882376.89527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882376.89552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882376.89619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882376.89685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882376.89729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882376.89768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 
1726882376.89950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882378.99162: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", 
"ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto"<<< 8238 1726882378.99173: stdout chunk (state=3): >>>, "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3109, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 607, "free": 3109}, "nocache": {"free": 3484, "used": 232}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 523, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251394269184, "block_size": 4096, "block_total": 64483404, "block_available": 61375554, "block_used": 3107850, "inode_total": 16384000, "inode_available": 16303148, "inode_used": 80852, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_loadavg": {"1m": 0.654296875, "5m": 0.34228515625, "15m": 0.15673828125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "58", "epoch": "1726882378", "epoch_int": "1726882378", "date": "2024-09-20", "time": "21:32:58", "iso8601_micro": "2024-09-21T01:32:58.961181Z", "iso8601": "2024-09-21T01:32:58Z", "iso8601_basic": "20240920T213258961181", "iso8601_basic_short": "20240920T213258", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "<<< 8238 1726882378.99189: stdout chunk (state=3): >>>02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8238 1726882379.01432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882379.01436: stdout chunk (state=3): >>><<< 8238 1726882379.01438: stderr chunk (state=3): >>><<< 8238 1726882379.01442: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", 
"ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3109, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 607, "free": 3109}, "nocache": {"free": 3484, "used": 232}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 523, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251394269184, "block_size": 4096, "block_total": 64483404, "block_available": 61375554, "block_used": 3107850, "inode_total": 16384000, "inode_available": 16303148, "inode_used": 80852, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_loadavg": {"1m": 0.654296875, "5m": 0.34228515625, "15m": 0.15673828125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "58", "epoch": "1726882378", "epoch_int": "1726882378", "date": "2024-09-20", "time": "21:32:58", "iso8601_micro": "2024-09-21T01:32:58.961181Z", "iso8601": "2024-09-21T01:32:58Z", "iso8601_basic": "20240920T213258961181", "iso8601_basic_short": "20240920T213258", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882379.01763: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882379.01800: _low_level_execute_command(): starting 8238 1726882379.01810: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882376.760305-8556-37451350208865/ > /dev/null 2>&1 && sleep 0' 8238 1726882379.02520: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882379.02539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882379.02640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.02675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882379.02691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882379.02715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882379.02831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882379.04780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882379.04797: stdout chunk (state=3): >>><<< 8238 1726882379.04811: stderr chunk (state=3): >>><<< 8238 1726882379.04833: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882379.05032: handler run complete 8238 1726882379.05036: variable 'ansible_facts' from source: unknown 8238 1726882379.05118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882379.05512: variable 'ansible_facts' from source: unknown 8238 1726882379.05617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882379.05973: attempt loop complete, returning result 8238 1726882379.06043: _execute() done 8238 1726882379.06056: dumping result to json 8238 1726882379.06238: done dumping result, returning 8238 1726882379.06241: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-54bc-d334-000000000128] 8238 1726882379.06244: sending task result for task 0affc7ec-ae25-54bc-d334-000000000128 8238 1726882379.07058: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000128 8238 1726882379.07061: WORKER PROCESS EXITING ok: [managed_node3] 8238 1726882379.07685: no more pending results, returning what we have 8238 1726882379.07688: results queue empty 8238 1726882379.07689: checking for any_errors_fatal 8238 1726882379.07690: done checking for any_errors_fatal 8238 1726882379.07724: checking for max_fail_percentage 8238 1726882379.07727: done checking for max_fail_percentage 8238 1726882379.07728: checking to see if all hosts have failed and the running result is not ok 8238 1726882379.07729: done checking to see if all hosts have failed 8238 1726882379.07730: getting the remaining hosts for this loop 8238 1726882379.07731: done getting the remaining hosts for this loop 8238 1726882379.07735: getting the next task for host managed_node3 8238 1726882379.07741: done getting next task for host managed_node3 8238 1726882379.07744: ^ task is: TASK: meta (flush_handlers) 8238 1726882379.07746: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882379.07752: getting variables 8238 1726882379.07754: in VariableManager get_vars() 8238 1726882379.07786: Calling all_inventory to load vars for managed_node3 8238 1726882379.07789: Calling groups_inventory to load vars for managed_node3 8238 1726882379.07791: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882379.07920: Calling all_plugins_play to load vars for managed_node3 8238 1726882379.07929: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882379.07934: Calling groups_plugins_play to load vars for managed_node3 8238 1726882379.08219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882379.08810: done with get_vars() 8238 1726882379.08824: done getting variables 8238 1726882379.09039: in VariableManager get_vars() 8238 1726882379.09060: Calling all_inventory to load vars for managed_node3 8238 1726882379.09063: Calling groups_inventory to load vars for managed_node3 8238 1726882379.09065: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882379.09079: Calling all_plugins_play to load vars for managed_node3 8238 1726882379.09082: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882379.09085: Calling groups_plugins_play to load vars for managed_node3 8238 1726882379.09292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882379.09829: done with get_vars() 8238 1726882379.09850: done queuing things up, now waiting for results queue to drain 8238 1726882379.09852: results queue empty 8238 1726882379.09853: checking for any_errors_fatal 8238 1726882379.09857: done checking for any_errors_fatal 8238 1726882379.09858: checking for max_fail_percentage 8238 1726882379.09861: done checking for max_fail_percentage 8238 1726882379.09866: checking to see if all hosts have failed and the running result is not ok 8238 1726882379.09867: done checking to see if all hosts have failed 8238 1726882379.09867: getting the remaining hosts for this loop 8238 1726882379.09868: done getting the remaining hosts for this loop 8238 1726882379.09872: getting the next task for host managed_node3 8238 1726882379.09880: done getting next task for host managed_node3 8238 1726882379.09883: ^ task is: TASK: INIT Prepare setup 8238 1726882379.09884: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882379.09887: getting variables 8238 1726882379.09888: in VariableManager get_vars() 8238 1726882379.09901: Calling all_inventory to load vars for managed_node3 8238 1726882379.09903: Calling groups_inventory to load vars for managed_node3 8238 1726882379.09905: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882379.09910: Calling all_plugins_play to load vars for managed_node3 8238 1726882379.09912: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882379.09915: Calling groups_plugins_play to load vars for managed_node3 8238 1726882379.10291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882379.10739: done with get_vars() 8238 1726882379.10755: done getting variables 8238 1726882379.10905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Friday 20 September 2024 21:32:59 -0400 (0:00:02.390) 0:00:09.264 ****** 8238 1726882379.10936: entering _queue_task() for managed_node3/debug 8238 1726882379.10938: Creating lock for debug 8238 1726882379.11264: worker is 1 (out of 1 available) 8238 1726882379.11276: exiting _queue_task() for managed_node3/debug 8238 1726882379.11289: done queuing things up, now waiting for results queue to drain 8238 1726882379.11291: waiting for pending results... 
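The ansible_facts returned above (ansible_distribution, ansible_distribution_major_version, ansible_default_ipv4, and the rest) are what the per-task conditionals in the remainder of this run evaluate against; the log reports "Evaluated conditional (ansible_distribution_major_version != '6'): True" for each task below. As an illustrative sketch only (this task is not taken from the test playbook, whose source is not reproduced in this log), a task guarded by and consuming those facts could look like:

    - name: Show a few gathered facts (illustrative sketch, not from the test playbook)
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_version }},
          default IPv4 {{ ansible_default_ipv4.address | default('none') }}
      when: ansible_distribution_major_version != '6'   # same conditional the log evaluates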
8238 1726882379.11742: running TaskExecutor() for managed_node3/TASK: INIT Prepare setup 8238 1726882379.12332: in run() - task 0affc7ec-ae25-54bc-d334-00000000000b 8238 1726882379.12337: variable 'ansible_search_path' from source: unknown 8238 1726882379.12340: calling self._execute() 8238 1726882379.12415: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882379.12437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882379.12458: variable 'omit' from source: magic vars 8238 1726882379.13018: variable 'ansible_distribution_major_version' from source: facts 8238 1726882379.13040: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882379.13058: variable 'omit' from source: magic vars 8238 1726882379.13100: variable 'omit' from source: magic vars 8238 1726882379.13184: variable 'omit' from source: magic vars 8238 1726882379.13211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882379.13260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882379.13293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882379.13391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882379.13400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882379.13404: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882379.13407: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882379.13409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882379.13465: Set connection var ansible_connection to ssh 8238 1726882379.13468: Set connection var ansible_shell_type to sh 8238 1726882379.13471: Set connection var ansible_pipelining to False 8238 1726882379.13477: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882379.13483: Set connection var ansible_timeout to 10 8238 1726882379.13490: Set connection var ansible_shell_executable to /bin/sh 8238 1726882379.13508: variable 'ansible_shell_executable' from source: unknown 8238 1726882379.13511: variable 'ansible_connection' from source: unknown 8238 1726882379.13514: variable 'ansible_module_compression' from source: unknown 8238 1726882379.13517: variable 'ansible_shell_type' from source: unknown 8238 1726882379.13520: variable 'ansible_shell_executable' from source: unknown 8238 1726882379.13524: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882379.13536: variable 'ansible_pipelining' from source: unknown 8238 1726882379.13539: variable 'ansible_timeout' from source: unknown 8238 1726882379.13543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882379.13651: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882379.13658: variable 'omit' from source: magic vars 8238 1726882379.13662: starting attempt loop 8238 1726882379.13665: running the handler 8238 1726882379.13701: handler run complete 
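The "Set connection var ..." entries above record the effective connection settings used for managed_node3 in this run: an ssh connection, sh shell type, pipelining disabled, ZIP_DEFLATED module compression, a 10 second timeout, and /bin/sh as the shell executable. These are standard Ansible connection variables; as a hedged sketch (the inventory actually used by this run is not reproduced here, and the host-to-address mapping is inferred from the connections to 10.31.45.226 seen above), equivalent host variables could be declared in a YAML inventory like this:

    all:
      hosts:
        managed_node3:
          ansible_host: 10.31.45.226      # address this log connects to for managed_node3
          ansible_connection: ssh
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh
          ansible_pipelining: false
          ansible_timeout: 10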
8238 1726882379.13720: attempt loop complete, returning result 8238 1726882379.13723: _execute() done 8238 1726882379.13738: dumping result to json 8238 1726882379.13741: done dumping result, returning 8238 1726882379.13744: done running TaskExecutor() for managed_node3/TASK: INIT Prepare setup [0affc7ec-ae25-54bc-d334-00000000000b] 8238 1726882379.13747: sending task result for task 0affc7ec-ae25-54bc-d334-00000000000b 8238 1726882379.13838: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000000b 8238 1726882379.13842: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 8238 1726882379.13901: no more pending results, returning what we have 8238 1726882379.13905: results queue empty 8238 1726882379.13906: checking for any_errors_fatal 8238 1726882379.13910: done checking for any_errors_fatal 8238 1726882379.13911: checking for max_fail_percentage 8238 1726882379.13912: done checking for max_fail_percentage 8238 1726882379.13913: checking to see if all hosts have failed and the running result is not ok 8238 1726882379.13914: done checking to see if all hosts have failed 8238 1726882379.13914: getting the remaining hosts for this loop 8238 1726882379.13916: done getting the remaining hosts for this loop 8238 1726882379.13920: getting the next task for host managed_node3 8238 1726882379.13930: done getting next task for host managed_node3 8238 1726882379.13934: ^ task is: TASK: Install dnsmasq 8238 1726882379.13937: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882379.13941: getting variables 8238 1726882379.13942: in VariableManager get_vars() 8238 1726882379.14057: Calling all_inventory to load vars for managed_node3 8238 1726882379.14060: Calling groups_inventory to load vars for managed_node3 8238 1726882379.14062: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882379.14071: Calling all_plugins_play to load vars for managed_node3 8238 1726882379.14074: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882379.14076: Calling groups_plugins_play to load vars for managed_node3 8238 1726882379.14249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882379.14479: done with get_vars() 8238 1726882379.14489: done getting variables 8238 1726882379.14558: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:32:59 -0400 (0:00:00.036) 0:00:09.301 ****** 8238 1726882379.14597: entering _queue_task() for managed_node3/package 8238 1726882379.14864: worker is 1 (out of 1 available) 8238 1726882379.14879: exiting _queue_task() for managed_node3/package 8238 1726882379.14891: done queuing things up, now waiting for results queue to drain 8238 1726882379.14892: waiting for pending results... 8238 1726882379.15149: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 8238 1726882379.15232: in run() - task 0affc7ec-ae25-54bc-d334-00000000000f 8238 1726882379.15259: variable 'ansible_search_path' from source: unknown 8238 1726882379.15281: variable 'ansible_search_path' from source: unknown 8238 1726882379.15358: calling self._execute() 8238 1726882379.15414: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882379.15430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882379.15444: variable 'omit' from source: magic vars 8238 1726882379.15838: variable 'ansible_distribution_major_version' from source: facts 8238 1726882379.15901: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882379.15905: variable 'omit' from source: magic vars 8238 1726882379.15928: variable 'omit' from source: magic vars 8238 1726882379.16144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882379.17672: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882379.17716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882379.17745: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882379.17782: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882379.17804: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882379.17881: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882379.17901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882379.17919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882379.17952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882379.17962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882379.18039: variable '__network_is_ostree' from source: set_fact 8238 1726882379.18043: variable 'omit' from source: magic vars 8238 1726882379.18066: variable 'omit' from source: magic vars 8238 1726882379.18090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882379.18112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882379.18126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882379.18326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882379.18330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882379.18332: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882379.18335: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882379.18337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882379.18339: Set connection var ansible_connection to ssh 8238 1726882379.18341: Set connection var ansible_shell_type to sh 8238 1726882379.18344: Set connection var ansible_pipelining to False 8238 1726882379.18346: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882379.18349: Set connection var ansible_timeout to 10 8238 1726882379.18353: Set connection var ansible_shell_executable to /bin/sh 8238 1726882379.18379: variable 'ansible_shell_executable' from source: unknown 8238 1726882379.18387: variable 'ansible_connection' from source: unknown 8238 1726882379.18394: variable 'ansible_module_compression' from source: unknown 8238 1726882379.18403: variable 'ansible_shell_type' from source: unknown 8238 1726882379.18412: variable 'ansible_shell_executable' from source: unknown 8238 1726882379.18419: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882379.18430: variable 'ansible_pipelining' from source: unknown 8238 1726882379.18437: variable 'ansible_timeout' from source: unknown 8238 1726882379.18444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882379.18555: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882379.18572: variable 'omit' from source: magic vars 8238 1726882379.18583: starting attempt loop 8238 1726882379.18589: running the handler 8238 1726882379.18599: variable 'ansible_facts' from source: unknown 8238 1726882379.18606: variable 'ansible_facts' from source: unknown 8238 1726882379.18649: _low_level_execute_command(): starting 8238 1726882379.18661: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882379.19302: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882379.19325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882379.19342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.19359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.19378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.19417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882379.19432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882379.19527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882379.21206: stdout chunk (state=3): >>>/root <<< 8238 1726882379.21319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882379.21366: stderr chunk (state=3): >>><<< 8238 1726882379.21370: stdout chunk (state=3): >>><<< 8238 1726882379.21387: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882379.21397: _low_level_execute_command(): starting 8238 1726882379.21407: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259 `" && echo ansible-tmp-1726882379.2138681-8662-40248107798259="` echo /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259 `" ) && sleep 0' 8238 1726882379.21816: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.21856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882379.21860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882379.21863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.21866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.21868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882379.21870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.21913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882379.21916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882379.22006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882379.23953: stdout chunk (state=3): >>>ansible-tmp-1726882379.2138681-8662-40248107798259=/root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259 <<< 8238 1726882379.24070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882379.24113: stderr chunk (state=3): >>><<< 8238 1726882379.24116: stdout chunk (state=3): >>><<< 8238 1726882379.24132: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882379.2138681-8662-40248107798259=/root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882379.24160: variable 'ansible_module_compression' from source: unknown 8238 1726882379.24203: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8238 1726882379.24207: ANSIBALLZ: Acquiring lock 8238 1726882379.24209: ANSIBALLZ: Lock acquired: 140036204254016 8238 1726882379.24212: ANSIBALLZ: Creating module 8238 1726882379.35484: ANSIBALLZ: Writing module into payload 8238 1726882379.35627: ANSIBALLZ: Writing module 8238 1726882379.35650: ANSIBALLZ: Renaming module 8238 1726882379.35654: ANSIBALLZ: Done creating module 8238 1726882379.35673: variable 'ansible_facts' from source: unknown 8238 1726882379.35735: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/AnsiballZ_dnf.py 8238 1726882379.35846: Sending initial data 8238 1726882379.35849: Sent initial data (149 bytes) 8238 1726882379.36352: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.36355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882379.36358: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.36360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.36362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.36414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882379.36417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882379.36420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882379.36515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882379.38328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882379.38419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882379.38515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpn6ei7wi6 /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/AnsiballZ_dnf.py <<< 8238 1726882379.38519: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/AnsiballZ_dnf.py" <<< 8238 1726882379.38612: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpn6ei7wi6" to remote "/root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/AnsiballZ_dnf.py" <<< 8238 1726882379.39557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882379.39616: stderr chunk (state=3): >>><<< 8238 1726882379.39619: stdout chunk (state=3): >>><<< 8238 1726882379.39644: done transferring module to remote 8238 1726882379.39655: _low_level_execute_command(): starting 8238 1726882379.39660: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/ /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/AnsiballZ_dnf.py && sleep 0' 8238 1726882379.40109: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882379.40114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882379.40116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.40118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.40121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.40174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882379.40178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882379.40270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882379.42157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882379.42169: stderr chunk (state=3): >>><<< 8238 1726882379.42178: stdout chunk (state=3): >>><<< 8238 1726882379.42205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882379.42213: _low_level_execute_command(): starting 8238 1726882379.42224: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/AnsiballZ_dnf.py && sleep 0' 8238 1726882379.42848: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882379.42937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882379.42970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882379.42986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882379.43003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882379.43117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882380.50672: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, 
"conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 8238 1726882380.55479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882380.55484: stdout chunk (state=3): >>><<< 8238 1726882380.55486: stderr chunk (state=3): >>><<< 8238 1726882380.55629: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
8238 1726882380.55637: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882380.55639: _low_level_execute_command(): starting 8238 1726882380.55641: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882379.2138681-8662-40248107798259/ > /dev/null 2>&1 && sleep 0' 8238 1726882380.56257: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882380.56270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882380.56285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882380.56400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882380.56431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882380.56559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882380.58635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882380.58638: stdout chunk (state=3): >>><<< 8238 1726882380.58641: stderr chunk (state=3): >>><<< 8238 1726882380.58660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882380.58673: handler run complete 8238 1726882380.58870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882380.59228: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882380.59231: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882380.59234: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882380.59236: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882380.59280: variable '__install_status' from source: unknown 8238 1726882380.59305: Evaluated conditional (__install_status is success): True 8238 1726882380.59328: attempt loop complete, returning result 8238 1726882380.59336: _execute() done 8238 1726882380.59344: dumping result to json 8238 1726882380.59365: done dumping result, returning 8238 1726882380.59377: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0affc7ec-ae25-54bc-d334-00000000000f] 8238 1726882380.59386: sending task result for task 0affc7ec-ae25-54bc-d334-00000000000f 8238 1726882380.59751: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000000f 8238 1726882380.59755: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8238 1726882380.59857: no more pending results, returning what we have 8238 1726882380.59861: results queue empty 8238 1726882380.59862: checking for any_errors_fatal 8238 1726882380.59873: done checking for any_errors_fatal 8238 1726882380.59874: checking for max_fail_percentage 8238 1726882380.59876: done checking for max_fail_percentage 8238 1726882380.59877: checking to see if all hosts have failed and the running result is not ok 8238 1726882380.59878: done checking to see if all hosts have failed 8238 1726882380.59879: getting the remaining hosts for this loop 8238 1726882380.59881: done getting the remaining hosts for this loop 8238 1726882380.59884: getting the next task for host managed_node3 8238 1726882380.59890: done getting next task for host managed_node3 8238 1726882380.59893: ^ task is: TASK: Install pgrep, sysctl 8238 1726882380.59897: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882380.59900: getting variables 8238 1726882380.59902: in VariableManager get_vars() 8238 1726882380.60013: Calling all_inventory to load vars for managed_node3 8238 1726882380.60016: Calling groups_inventory to load vars for managed_node3 8238 1726882380.60018: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882380.60031: Calling all_plugins_play to load vars for managed_node3 8238 1726882380.60034: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882380.60037: Calling groups_plugins_play to load vars for managed_node3 8238 1726882380.60218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882380.60678: done with get_vars() 8238 1726882380.60689: done getting variables 8238 1726882380.60754: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:33:00 -0400 (0:00:01.461) 0:00:10.762 ****** 8238 1726882380.60792: entering _queue_task() for managed_node3/package 8238 1726882380.61112: worker is 1 (out of 1 available) 8238 1726882380.61125: exiting _queue_task() for managed_node3/package 8238 1726882380.61137: done queuing things up, now waiting for results queue to drain 8238 1726882380.61139: waiting for pending results... 8238 1726882380.61354: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 8238 1726882380.61480: in run() - task 0affc7ec-ae25-54bc-d334-000000000010 8238 1726882380.61503: variable 'ansible_search_path' from source: unknown 8238 1726882380.61527: variable 'ansible_search_path' from source: unknown 8238 1726882380.61568: calling self._execute() 8238 1726882380.61729: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882380.61733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882380.61736: variable 'omit' from source: magic vars 8238 1726882380.62129: variable 'ansible_distribution_major_version' from source: facts 8238 1726882380.62151: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882380.62233: variable 'ansible_os_family' from source: facts 8238 1726882380.62237: Evaluated conditional (ansible_os_family == 'RedHat'): True 8238 1726882380.62364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882380.62566: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882380.62598: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882380.62627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882380.62654: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882380.62716: variable 'ansible_distribution_major_version' from source: facts 8238 1726882380.62728: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): 
False 8238 1726882380.62731: when evaluation is False, skipping this task 8238 1726882380.62734: _execute() done 8238 1726882380.62736: dumping result to json 8238 1726882380.62741: done dumping result, returning 8238 1726882380.62745: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affc7ec-ae25-54bc-d334-000000000010] 8238 1726882380.62828: sending task result for task 0affc7ec-ae25-54bc-d334-000000000010 8238 1726882380.62902: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000010 8238 1726882380.62905: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 8238 1726882380.62946: no more pending results, returning what we have 8238 1726882380.62951: results queue empty 8238 1726882380.62952: checking for any_errors_fatal 8238 1726882380.62958: done checking for any_errors_fatal 8238 1726882380.62958: checking for max_fail_percentage 8238 1726882380.62959: done checking for max_fail_percentage 8238 1726882380.62960: checking to see if all hosts have failed and the running result is not ok 8238 1726882380.62961: done checking to see if all hosts have failed 8238 1726882380.62961: getting the remaining hosts for this loop 8238 1726882380.62962: done getting the remaining hosts for this loop 8238 1726882380.62964: getting the next task for host managed_node3 8238 1726882380.62968: done getting next task for host managed_node3 8238 1726882380.62970: ^ task is: TASK: Install pgrep, sysctl 8238 1726882380.62972: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882380.62974: getting variables 8238 1726882380.62975: in VariableManager get_vars() 8238 1726882380.62999: Calling all_inventory to load vars for managed_node3 8238 1726882380.63001: Calling groups_inventory to load vars for managed_node3 8238 1726882380.63003: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882380.63009: Calling all_plugins_play to load vars for managed_node3 8238 1726882380.63011: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882380.63013: Calling groups_plugins_play to load vars for managed_node3 8238 1726882380.63125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882380.63249: done with get_vars() 8238 1726882380.63257: done getting variables 8238 1726882380.63297: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:33:00 -0400 (0:00:00.025) 0:00:10.788 ****** 8238 1726882380.63317: entering _queue_task() for managed_node3/package 8238 1726882380.63500: worker is 1 (out of 1 available) 8238 1726882380.63514: exiting _queue_task() for managed_node3/package 8238 1726882380.63527: done queuing things up, now waiting for results queue to drain 8238 1726882380.63529: waiting for pending results... 8238 1726882380.63665: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 8238 1726882380.63732: in run() - task 0affc7ec-ae25-54bc-d334-000000000011 8238 1726882380.63743: variable 'ansible_search_path' from source: unknown 8238 1726882380.63752: variable 'ansible_search_path' from source: unknown 8238 1726882380.63777: calling self._execute() 8238 1726882380.63834: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882380.63838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882380.63847: variable 'omit' from source: magic vars 8238 1726882380.64105: variable 'ansible_distribution_major_version' from source: facts 8238 1726882380.64114: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882380.64195: variable 'ansible_os_family' from source: facts 8238 1726882380.64205: Evaluated conditional (ansible_os_family == 'RedHat'): True 8238 1726882380.64335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882380.64583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882380.64615: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882380.64643: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882380.64672: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882380.64724: variable 'ansible_distribution_major_version' from source: facts 8238 1726882380.64736: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 
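The two "Install pgrep, sysctl" tasks at lines 17 and 26 of create_test_interfaces_with_dhcp.yml are version-gated variants: the first was just skipped because ansible_distribution_major_version is version('6', '<=') evaluated to False, and the second proceeds (its dnf result for procps-ng appears further down) because version('7', '>=') is True. The exact task text is not captured here; a hedged sketch of this mutually exclusive version-branch pattern, with the EL6-branch package name assumed, is:

    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps          # EL6-era package name (assumed, not shown in this log)
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('6', '<=')

    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps-ng
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('7', '>=')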
8238 1726882380.64739: variable 'omit' from source: magic vars 8238 1726882380.64776: variable 'omit' from source: magic vars 8238 1726882380.64883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882380.66275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882380.66316: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882380.66427: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882380.66430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882380.66432: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882380.66500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882380.66533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882380.66565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882380.66613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882380.66637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882380.66743: variable '__network_is_ostree' from source: set_fact 8238 1726882380.66758: variable 'omit' from source: magic vars 8238 1726882380.66792: variable 'omit' from source: magic vars 8238 1726882380.66831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882380.66869: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882380.66892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882380.66915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882380.66933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882380.66974: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882380.67027: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882380.67030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882380.67105: Set connection var ansible_connection to ssh 8238 1726882380.67113: Set connection var ansible_shell_type to sh 8238 1726882380.67127: Set connection var ansible_pipelining to False 8238 1726882380.67138: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882380.67153: Set connection var ansible_timeout to 
10 8238 1726882380.67167: Set connection var ansible_shell_executable to /bin/sh 8238 1726882380.67227: variable 'ansible_shell_executable' from source: unknown 8238 1726882380.67230: variable 'ansible_connection' from source: unknown 8238 1726882380.67233: variable 'ansible_module_compression' from source: unknown 8238 1726882380.67235: variable 'ansible_shell_type' from source: unknown 8238 1726882380.67237: variable 'ansible_shell_executable' from source: unknown 8238 1726882380.67240: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882380.67242: variable 'ansible_pipelining' from source: unknown 8238 1726882380.67244: variable 'ansible_timeout' from source: unknown 8238 1726882380.67246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882380.67357: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882380.67427: variable 'omit' from source: magic vars 8238 1726882380.67430: starting attempt loop 8238 1726882380.67433: running the handler 8238 1726882380.67436: variable 'ansible_facts' from source: unknown 8238 1726882380.67438: variable 'ansible_facts' from source: unknown 8238 1726882380.67442: _low_level_execute_command(): starting 8238 1726882380.67457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882380.68175: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882380.68193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882380.68210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882380.68231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882380.68252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882380.68265: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882380.68279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.68303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882380.68333: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882380.68350: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882380.68365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882380.68380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882380.68408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.68488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882380.68516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882380.68641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 8238 1726882380.70411: stdout chunk (state=3): >>>/root <<< 8238 1726882380.70518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882380.70569: stderr chunk (state=3): >>><<< 8238 1726882380.70573: stdout chunk (state=3): >>><<< 8238 1726882380.70593: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882380.70603: _low_level_execute_command(): starting 8238 1726882380.70608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536 `" && echo ansible-tmp-1726882380.7059245-8703-41594550986536="` echo /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536 `" ) && sleep 0' 8238 1726882380.71056: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882380.71059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.71064: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882380.71066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.71130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882380.71133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882380.71210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882380.73212: stdout chunk (state=3): 
>>>ansible-tmp-1726882380.7059245-8703-41594550986536=/root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536 <<< 8238 1726882380.73328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882380.73373: stderr chunk (state=3): >>><<< 8238 1726882380.73377: stdout chunk (state=3): >>><<< 8238 1726882380.73388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882380.7059245-8703-41594550986536=/root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882380.73414: variable 'ansible_module_compression' from source: unknown 8238 1726882380.73462: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 8238 1726882380.73499: variable 'ansible_facts' from source: unknown 8238 1726882380.73583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/AnsiballZ_dnf.py 8238 1726882380.73679: Sending initial data 8238 1726882380.73682: Sent initial data (149 bytes) 8238 1726882380.74110: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882380.74114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882380.74121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.74132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882380.74135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.74197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 
8238 1726882380.74204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882380.74303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882380.75946: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 8238 1726882380.75951: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882380.76026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882380.76113: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpkumus57b /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/AnsiballZ_dnf.py <<< 8238 1726882380.76115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/AnsiballZ_dnf.py" <<< 8238 1726882380.76193: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpkumus57b" to remote "/root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/AnsiballZ_dnf.py" <<< 8238 1726882380.77298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882380.77329: stderr chunk (state=3): >>><<< 8238 1726882380.77332: stdout chunk (state=3): >>><<< 8238 1726882380.77446: done transferring module to remote 8238 1726882380.77451: _low_level_execute_command(): starting 8238 1726882380.77455: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/ /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/AnsiballZ_dnf.py && sleep 0' 8238 1726882380.78112: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882380.78116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882380.78119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882380.78195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882380.80061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882380.80101: stderr chunk (state=3): >>><<< 8238 1726882380.80104: stdout chunk (state=3): >>><<< 8238 1726882380.80115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882380.80118: _low_level_execute_command(): starting 8238 1726882380.80125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/AnsiballZ_dnf.py && sleep 0' 8238 1726882380.80586: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.80589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882380.80685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882380.80770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882381.89168: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, 
"invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 8238 1726882381.93625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882381.93820: stderr chunk (state=3): >>><<< 8238 1726882381.93826: stdout chunk (state=3): >>><<< 8238 1726882381.93830: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
8238 1726882381.93833: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882381.93836: _low_level_execute_command(): starting 8238 1726882381.93838: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882380.7059245-8703-41594550986536/ > /dev/null 2>&1 && sleep 0' 8238 1726882381.94606: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882381.94633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882381.94654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882381.94703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882381.94714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882381.94755: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882381.94865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882381.94869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882381.94993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882381.97232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882381.97236: stdout chunk (state=3): >>><<< 8238 1726882381.97239: stderr chunk (state=3): >>><<< 8238 1726882381.97242: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882381.97250: handler run complete 8238 1726882381.97254: attempt loop complete, returning result 8238 1726882381.97256: _execute() done 8238 1726882381.97258: dumping result to json 8238 1726882381.97260: done dumping result, returning 8238 1726882381.97262: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affc7ec-ae25-54bc-d334-000000000011] 8238 1726882381.97265: sending task result for task 0affc7ec-ae25-54bc-d334-000000000011 8238 1726882381.97530: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000011 8238 1726882381.97534: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8238 1726882381.97629: no more pending results, returning what we have 8238 1726882381.97633: results queue empty 8238 1726882381.97634: checking for any_errors_fatal 8238 1726882381.97640: done checking for any_errors_fatal 8238 1726882381.97641: checking for max_fail_percentage 8238 1726882381.97643: done checking for max_fail_percentage 8238 1726882381.97644: checking to see if all hosts have failed and the running result is not ok 8238 1726882381.97645: done checking to see if all hosts have failed 8238 1726882381.97646: getting the remaining hosts for this loop 8238 1726882381.97650: done getting the remaining hosts for this loop 8238 1726882381.97655: getting the next task for host managed_node3 8238 1726882381.97662: done getting next task for host managed_node3 8238 1726882381.97665: ^ task is: TASK: Create test interfaces 8238 1726882381.97669: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882381.97678: getting variables 8238 1726882381.97681: in VariableManager get_vars() 8238 1726882381.97840: Calling all_inventory to load vars for managed_node3 8238 1726882381.97843: Calling groups_inventory to load vars for managed_node3 8238 1726882381.97846: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882381.97860: Calling all_plugins_play to load vars for managed_node3 8238 1726882381.97865: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882381.97869: Calling groups_plugins_play to load vars for managed_node3 8238 1726882381.98211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882381.98431: done with get_vars() 8238 1726882381.98591: done getting variables 8238 1726882381.98884: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:33:01 -0400 (0:00:01.355) 0:00:12.144 ****** 8238 1726882381.98917: entering _queue_task() for managed_node3/shell 8238 1726882381.98919: Creating lock for shell 8238 1726882381.99713: worker is 1 (out of 1 available) 8238 1726882381.99841: exiting _queue_task() for managed_node3/shell 8238 1726882381.99863: done queuing things up, now waiting for results queue to drain 8238 1726882381.99865: waiting for pending results... 
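The shell action loaded here backs the "Create test interfaces" task at create_test_interfaces_with_dhcp.yml:35, and the variable resolution that follows pulls dhcp_interface1 and dhcp_interface2 from the play vars. The script body itself is not reproduced in this excerpt; a purely hypothetical sketch of a task of this shape (the veth commands are illustrative placeholders, not the real test script):

    - name: Create test interfaces
      ansible.builtin.shell: |
        # Hypothetical illustration only; the actual script is not shown in this log.
        ip link add {{ dhcp_interface1 }} type veth peer name {{ dhcp_interface1 }}p
        ip link add {{ dhcp_interface2 }} type veth peer name {{ dhcp_interface2 }}p
        ip link set {{ dhcp_interface1 }} up
        ip link set {{ dhcp_interface2 }} up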
8238 1726882381.99995: running TaskExecutor() for managed_node3/TASK: Create test interfaces 8238 1726882382.00208: in run() - task 0affc7ec-ae25-54bc-d334-000000000012 8238 1726882382.00232: variable 'ansible_search_path' from source: unknown 8238 1726882382.00239: variable 'ansible_search_path' from source: unknown 8238 1726882382.00282: calling self._execute() 8238 1726882382.00374: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882382.00747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882382.00750: variable 'omit' from source: magic vars 8238 1726882382.01248: variable 'ansible_distribution_major_version' from source: facts 8238 1726882382.01627: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882382.01631: variable 'omit' from source: magic vars 8238 1726882382.01634: variable 'omit' from source: magic vars 8238 1726882382.02140: variable 'dhcp_interface1' from source: play vars 8238 1726882382.02153: variable 'dhcp_interface2' from source: play vars 8238 1726882382.02195: variable 'omit' from source: magic vars 8238 1726882382.02247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882382.02292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882382.02318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882382.02346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882382.02365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882382.02404: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882382.02414: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882382.02424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882382.02539: Set connection var ansible_connection to ssh 8238 1726882382.02547: Set connection var ansible_shell_type to sh 8238 1726882382.02558: Set connection var ansible_pipelining to False 8238 1726882382.02570: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882382.02580: Set connection var ansible_timeout to 10 8238 1726882382.02593: Set connection var ansible_shell_executable to /bin/sh 8238 1726882382.02628: variable 'ansible_shell_executable' from source: unknown 8238 1726882382.02636: variable 'ansible_connection' from source: unknown 8238 1726882382.02644: variable 'ansible_module_compression' from source: unknown 8238 1726882382.02650: variable 'ansible_shell_type' from source: unknown 8238 1726882382.02657: variable 'ansible_shell_executable' from source: unknown 8238 1726882382.02666: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882382.02676: variable 'ansible_pipelining' from source: unknown 8238 1726882382.02683: variable 'ansible_timeout' from source: unknown 8238 1726882382.02692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882382.02852: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 8238 1726882382.02871: variable 'omit' from source: magic vars 8238 1726882382.02882: starting attempt loop 8238 1726882382.02889: running the handler 8238 1726882382.02902: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882382.02929: _low_level_execute_command(): starting 8238 1726882382.02943: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882382.03686: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882382.03704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882382.03719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882382.03742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882382.03759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882382.03769: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882382.03780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882382.03794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882382.03805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882382.03814: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882382.03927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882382.03931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882382.03972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882382.04055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882382.05770: stdout chunk (state=3): >>>/root <<< 8238 1726882382.05953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882382.05967: stdout chunk (state=3): >>><<< 8238 1726882382.05981: stderr chunk (state=3): >>><<< 8238 1726882382.06015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882382.06039: _low_level_execute_command(): starting 8238 1726882382.06051: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363 `" && echo ansible-tmp-1726882382.0602531-8758-154066226739363="` echo /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363 `" ) && sleep 0' 8238 1726882382.06741: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882382.06756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882382.06772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882382.06893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882382.06908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882382.06933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882382.06950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882382.07068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882382.09062: stdout chunk (state=3): >>>ansible-tmp-1726882382.0602531-8758-154066226739363=/root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363 <<< 8238 1726882382.09244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882382.09288: stderr chunk (state=3): >>><<< 8238 1726882382.09301: stdout chunk (state=3): >>><<< 8238 1726882382.09335: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882382.0602531-8758-154066226739363=/root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882382.09387: variable 'ansible_module_compression' from source: unknown 8238 1726882382.09456: ANSIBALLZ: Using generic lock for ansible.legacy.command 8238 1726882382.09476: ANSIBALLZ: Acquiring lock 8238 1726882382.09479: ANSIBALLZ: Lock acquired: 140036204254016 8238 1726882382.09527: ANSIBALLZ: Creating module 8238 1726882382.22044: ANSIBALLZ: Writing module into payload 8238 1726882382.22155: ANSIBALLZ: Writing module 8238 1726882382.22187: ANSIBALLZ: Renaming module 8238 1726882382.22200: ANSIBALLZ: Done creating module 8238 1726882382.22327: variable 'ansible_facts' from source: unknown 8238 1726882382.22331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/AnsiballZ_command.py 8238 1726882382.22562: Sending initial data 8238 1726882382.22572: Sent initial data (154 bytes) 8238 1726882382.23206: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882382.23246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882382.23276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882382.23372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882382.23396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882382.23517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882382.25267: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882382.25335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882382.25429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp32vp39do /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/AnsiballZ_command.py <<< 8238 1726882382.25433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/AnsiballZ_command.py" <<< 8238 1726882382.25514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp32vp39do" to remote "/root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/AnsiballZ_command.py" <<< 8238 1726882382.26746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882382.26987: stderr chunk (state=3): >>><<< 8238 1726882382.26991: stdout chunk (state=3): >>><<< 8238 1726882382.26993: done transferring module to remote 8238 1726882382.26995: _low_level_execute_command(): starting 8238 1726882382.26998: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/ /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/AnsiballZ_command.py && sleep 0' 8238 1726882382.27587: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882382.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882382.27630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882382.27745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882382.27775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882382.27791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882382.27912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882382.29933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882382.29938: stdout chunk (state=3): >>><<< 8238 1726882382.29940: stderr chunk (state=3): >>><<< 8238 1726882382.30155: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882382.30159: _low_level_execute_command(): starting 8238 1726882382.30162: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/AnsiballZ_command.py && sleep 0' 8238 1726882382.31451: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882382.31758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882382.31764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882382.31766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882383.74804: stdout chunk (state=3): >>> <<< 8238 1726882383.74847: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 685 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 685 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ 
'[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:02.479050", "end": "2024-09-20 21:33:03.746146", "delta": "0:00:01.267096", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882383.76559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882383.76563: stdout chunk (state=3): >>><<< 8238 1726882383.76566: stderr chunk (state=3): >>><<< 8238 1726882383.76599: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 685 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 685 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:02.479050", "end": "2024-09-20 21:33:03.746146", "delta": "0:00:01.267096", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
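
The JSON blob above is the module result for the shell task, and its "cmd" field carries the full interface-setup script. Stripped of the RHEL 6 branch, the NetworkManager managed/unmanaged handling and the address-retry loop, it amounts to: create two veth pairs, enslave the peer ends to a testbr bridge carrying 192.0.2.1/24 and 2001:DB8::1/32, and start dnsmasq as the DHCP/RA server. A condensed Python rendering of those same commands, purely for readability (the playbook actually runs the shell script shown above):

# Condensed equivalent of the "Create test interfaces" shell script above:
# two veth pairs, a bridge with static v4/v6 addresses, and dnsmasq for DHCP.
# Illustrative only -- the task runs the shell script shown in the result.
import subprocess

def ip(*args):
    subprocess.run(["ip", *args], check=True)

for dev in ("test1", "test2"):
    ip("link", "add", dev, "type", "veth", "peer", "name", dev + "p")
    ip("link", "set", dev + "p", "up")

ip("link", "add", "name", "testbr", "type", "bridge", "forward_delay", "0")
ip("link", "set", "testbr", "up")
ip("addr", "add", "192.0.2.1/24", "dev", "testbr")
ip("-6", "addr", "add", "2001:DB8::1/32", "dev", "testbr")
ip("link", "set", "test1p", "master", "testbr")
ip("link", "set", "test2p", "master", "testbr")

subprocess.run([
    "dnsmasq",
    "--pid-file=/run/dhcp_testbr.pid",
    "--dhcp-leasefile=/run/dhcp_testbr.lease",
    "--dhcp-range=192.0.2.1,192.0.2.254,240",
    "--dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240",
    "--enable-ra", "--interface=testbr", "--bind-interfaces",
], check=True)

The "+ ..." lines in the result's "stderr" field are simply the set -x trace of these same commands, redirected to stderr by the script's "exec 1>&2".
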
8238 1726882383.76734: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882383.76738: _low_level_execute_command(): starting 8238 1726882383.76741: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882382.0602531-8758-154066226739363/ > /dev/null 2>&1 && sleep 0' 8238 1726882383.77513: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.77610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882383.77684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882383.79615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882383.79668: stderr chunk (state=3): >>><<< 8238 1726882383.79672: stdout chunk (state=3): >>><<< 8238 1726882383.79688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882383.79697: handler run complete 8238 1726882383.79718: Evaluated conditional (False): False 8238 1726882383.79728: attempt loop complete, returning result 8238 1726882383.79731: _execute() done 8238 1726882383.79734: dumping result to json 8238 1726882383.79741: done dumping result, returning 8238 1726882383.79749: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0affc7ec-ae25-54bc-d334-000000000012] 8238 1726882383.79756: sending task result for task 0affc7ec-ae25-54bc-d334-000000000012 8238 1726882383.79866: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000012 8238 1726882383.79869: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.267096", "end": "2024-09-20 21:33:03.746146", "rc": 0, "start": "2024-09-20 21:33:02.479050" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 685 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 685 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 8238 1726882383.79975: no more pending results, returning what we have 8238 1726882383.79985: results queue empty 8238 1726882383.79986: checking for any_errors_fatal 8238 1726882383.79995: done checking for any_errors_fatal 8238 1726882383.79996: checking for max_fail_percentage 8238 1726882383.79997: done checking for max_fail_percentage 8238 1726882383.79999: checking to see if all hosts have failed and the running result is not ok 8238 1726882383.79999: done checking to see if all hosts have failed 8238 1726882383.80000: getting the remaining hosts for this loop 8238 1726882383.80002: done getting the remaining hosts for this loop 8238 1726882383.80007: getting the next task for host managed_node3 8238 1726882383.80016: done getting next task for host managed_node3 8238 1726882383.80019: ^ task is: TASK: Include the 
task 'get_interface_stat.yml' 8238 1726882383.80024: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882383.80028: getting variables 8238 1726882383.80029: in VariableManager get_vars() 8238 1726882383.80070: Calling all_inventory to load vars for managed_node3 8238 1726882383.80073: Calling groups_inventory to load vars for managed_node3 8238 1726882383.80075: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882383.80085: Calling all_plugins_play to load vars for managed_node3 8238 1726882383.80094: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882383.80098: Calling groups_plugins_play to load vars for managed_node3 8238 1726882383.80289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882383.80501: done with get_vars() 8238 1726882383.80515: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:03 -0400 (0:00:01.817) 0:00:13.961 ****** 8238 1726882383.80632: entering _queue_task() for managed_node3/include_tasks 8238 1726882383.81137: worker is 1 (out of 1 available) 8238 1726882383.81148: exiting _queue_task() for managed_node3/include_tasks 8238 1726882383.81160: done queuing things up, now waiting for results queue to drain 8238 1726882383.81161: waiting for pending results... 
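
Mechanically, the task that just completed followed the usual sequence of _low_level_execute_command() calls: discover the remote home with "echo ~", create a per-task temp directory, copy AnsiballZ_command.py across the existing SSH ControlMaster connection, chmod and run it with /usr/bin/python3.12, then remove the temp directory; the same sequence starts again for the stat task below. A rough stand-alone reproduction with subprocess (the host, control path and temp-directory name are placeholders, and the real ssh connection plugin passes many more options):

# Rough reproduction of the low-level command sequence visible in the log:
# discover the remote home, create a temp dir, run the transferred module,
# then remove the temp dir.  Host / control-path / tmp names are placeholders.
import subprocess

HOST = "root@10.31.45.226"                      # address seen in the log
CONTROL = "-oControlPath=/root/.ansible/cp/%C"  # reuse the multiplexed master

def ssh_sh(command):
    """Run `/bin/sh -c '<command>'` on the remote host over ssh."""
    return subprocess.run(
        ["ssh", CONTROL, HOST, f"/bin/sh -c '{command}'"],
        capture_output=True, text=True,
    )

home = ssh_sh("echo ~ && sleep 0").stdout.strip()          # -> "/root"
tmp = "/root/.ansible/tmp/ansible-tmp-EXAMPLE"             # placeholder name
ssh_sh(f'( umask 77 && mkdir -p "{tmp}" ) && sleep 0')     # remote tmpdir
# ... AnsiballZ_command.py is uploaded to {tmp} via sftp at this point ...
ssh_sh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_command.py && sleep 0")
ssh_sh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_command.py && sleep 0")
ssh_sh(f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0")     # cleanup
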
8238 1726882383.81283: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 8238 1726882383.81388: in run() - task 0affc7ec-ae25-54bc-d334-000000000016 8238 1726882383.81394: variable 'ansible_search_path' from source: unknown 8238 1726882383.81396: variable 'ansible_search_path' from source: unknown 8238 1726882383.81432: calling self._execute() 8238 1726882383.81507: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882383.81515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882383.81523: variable 'omit' from source: magic vars 8238 1726882383.81816: variable 'ansible_distribution_major_version' from source: facts 8238 1726882383.81832: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882383.81835: _execute() done 8238 1726882383.81838: dumping result to json 8238 1726882383.81844: done dumping result, returning 8238 1726882383.81850: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-54bc-d334-000000000016] 8238 1726882383.81858: sending task result for task 0affc7ec-ae25-54bc-d334-000000000016 8238 1726882383.81956: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000016 8238 1726882383.81959: WORKER PROCESS EXITING 8238 1726882383.81990: no more pending results, returning what we have 8238 1726882383.81996: in VariableManager get_vars() 8238 1726882383.82043: Calling all_inventory to load vars for managed_node3 8238 1726882383.82047: Calling groups_inventory to load vars for managed_node3 8238 1726882383.82049: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882383.82059: Calling all_plugins_play to load vars for managed_node3 8238 1726882383.82062: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882383.82064: Calling groups_plugins_play to load vars for managed_node3 8238 1726882383.82249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882383.82369: done with get_vars() 8238 1726882383.82376: variable 'ansible_search_path' from source: unknown 8238 1726882383.82377: variable 'ansible_search_path' from source: unknown 8238 1726882383.82407: we have included files to process 8238 1726882383.82408: generating all_blocks data 8238 1726882383.82409: done generating all_blocks data 8238 1726882383.82410: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882383.82410: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882383.82412: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882383.82580: done processing included file 8238 1726882383.82582: iterating over new_blocks loaded from include file 8238 1726882383.82583: in VariableManager get_vars() 8238 1726882383.82596: done with get_vars() 8238 1726882383.82597: filtering new block on tags 8238 1726882383.82607: done filtering new block on tags 8238 1726882383.82609: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 8238 1726882383.82612: extending task lists for all hosts 
with included blocks 8238 1726882383.82680: done extending task lists 8238 1726882383.82681: done processing included files 8238 1726882383.82682: results queue empty 8238 1726882383.82682: checking for any_errors_fatal 8238 1726882383.82685: done checking for any_errors_fatal 8238 1726882383.82686: checking for max_fail_percentage 8238 1726882383.82686: done checking for max_fail_percentage 8238 1726882383.82687: checking to see if all hosts have failed and the running result is not ok 8238 1726882383.82687: done checking to see if all hosts have failed 8238 1726882383.82688: getting the remaining hosts for this loop 8238 1726882383.82689: done getting the remaining hosts for this loop 8238 1726882383.82690: getting the next task for host managed_node3 8238 1726882383.82693: done getting next task for host managed_node3 8238 1726882383.82694: ^ task is: TASK: Get stat for interface {{ interface }} 8238 1726882383.82697: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882383.82698: getting variables 8238 1726882383.82699: in VariableManager get_vars() 8238 1726882383.82707: Calling all_inventory to load vars for managed_node3 8238 1726882383.82709: Calling groups_inventory to load vars for managed_node3 8238 1726882383.82710: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882383.82714: Calling all_plugins_play to load vars for managed_node3 8238 1726882383.82715: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882383.82717: Calling groups_plugins_play to load vars for managed_node3 8238 1726882383.82806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882383.82923: done with get_vars() 8238 1726882383.82930: done getting variables 8238 1726882383.83053: variable 'interface' from source: task vars 8238 1726882383.83058: variable 'dhcp_interface1' from source: play vars 8238 1726882383.83104: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:03 -0400 (0:00:00.025) 0:00:13.986 ****** 8238 1726882383.83137: entering _queue_task() for managed_node3/stat 8238 1726882383.83365: worker is 1 (out of 1 available) 8238 1726882383.83385: exiting _queue_task() for managed_node3/stat 8238 1726882383.83398: done queuing things up, now waiting for results queue to drain 8238 1726882383.83400: waiting for pending results... 
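
The "variable 'interface' from source: task vars" / "variable 'dhcp_interface1' from source: play vars" lines just above are the templating step that turns the raw task name "Get stat for interface {{ interface }}" into the header "TASK [Get stat for interface test1]": the included file's interface variable resolves to the play-level dhcp_interface1, which is test1. The same rendering done directly with Jinja2, with the values hard-coded here for illustration:

# How the task header "Get stat for interface test1" is produced from the raw
# task name "Get stat for interface {{ interface }}".  Values are hard-coded;
# in the run above they come from task vars / play vars (dhcp_interface1).
from jinja2 import Template

play_vars = {"dhcp_interface1": "test1"}
task_vars = {"interface": play_vars["dhcp_interface1"]}

name = Template("Get stat for interface {{ interface }}").render(**task_vars)
print(name)  # -> Get stat for interface test1
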
8238 1726882383.83738: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 8238 1726882383.83744: in run() - task 0affc7ec-ae25-54bc-d334-000000000152 8238 1726882383.83751: variable 'ansible_search_path' from source: unknown 8238 1726882383.83761: variable 'ansible_search_path' from source: unknown 8238 1726882383.83805: calling self._execute() 8238 1726882383.83898: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882383.83909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882383.83924: variable 'omit' from source: magic vars 8238 1726882383.84416: variable 'ansible_distribution_major_version' from source: facts 8238 1726882383.84438: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882383.84451: variable 'omit' from source: magic vars 8238 1726882383.84520: variable 'omit' from source: magic vars 8238 1726882383.84693: variable 'interface' from source: task vars 8238 1726882383.84696: variable 'dhcp_interface1' from source: play vars 8238 1726882383.84715: variable 'dhcp_interface1' from source: play vars 8238 1726882383.84741: variable 'omit' from source: magic vars 8238 1726882383.84792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882383.84843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882383.84873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882383.84895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882383.84917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882383.84960: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882383.84971: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882383.85018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882383.85099: Set connection var ansible_connection to ssh 8238 1726882383.85112: Set connection var ansible_shell_type to sh 8238 1726882383.85118: Set connection var ansible_pipelining to False 8238 1726882383.85127: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882383.85137: Set connection var ansible_timeout to 10 8238 1726882383.85142: Set connection var ansible_shell_executable to /bin/sh 8238 1726882383.85163: variable 'ansible_shell_executable' from source: unknown 8238 1726882383.85166: variable 'ansible_connection' from source: unknown 8238 1726882383.85168: variable 'ansible_module_compression' from source: unknown 8238 1726882383.85171: variable 'ansible_shell_type' from source: unknown 8238 1726882383.85173: variable 'ansible_shell_executable' from source: unknown 8238 1726882383.85175: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882383.85180: variable 'ansible_pipelining' from source: unknown 8238 1726882383.85183: variable 'ansible_timeout' from source: unknown 8238 1726882383.85186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882383.85367: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882383.85376: variable 'omit' from source: magic vars 8238 1726882383.85382: starting attempt loop 8238 1726882383.85385: running the handler 8238 1726882383.85398: _low_level_execute_command(): starting 8238 1726882383.85405: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882383.85912: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882383.85946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882383.85952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.85956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882383.85959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.86010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882383.86016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882383.86018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882383.86102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882383.87812: stdout chunk (state=3): >>>/root <<< 8238 1726882383.88106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882383.88110: stdout chunk (state=3): >>><<< 8238 1726882383.88113: stderr chunk (state=3): >>><<< 8238 1726882383.88118: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 8238 1726882383.88120: _low_level_execute_command(): starting 8238 1726882383.88126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552 `" && echo ansible-tmp-1726882383.8800943-8823-51676353952552="` echo /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552 `" ) && sleep 0' 8238 1726882383.88595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882383.88610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882383.88631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.88678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882383.88696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882383.88792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882383.90753: stdout chunk (state=3): >>>ansible-tmp-1726882383.8800943-8823-51676353952552=/root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552 <<< 8238 1726882383.90958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882383.90962: stdout chunk (state=3): >>><<< 8238 1726882383.90964: stderr chunk (state=3): >>><<< 8238 1726882383.91033: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882383.8800943-8823-51676353952552=/root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882383.91072: variable 'ansible_module_compression' from source: unknown 8238 1726882383.91176: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8238 1726882383.91211: variable 'ansible_facts' from source: unknown 8238 1726882383.91280: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/AnsiballZ_stat.py 8238 1726882383.91432: Sending initial data 8238 1726882383.91436: Sent initial data (150 bytes) 8238 1726882383.91824: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882383.91861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882383.91864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.91867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882383.91869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882383.91871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.91920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882383.91926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882383.92007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882383.93601: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882383.93682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882383.93768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmprmjwoxz8 /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/AnsiballZ_stat.py <<< 8238 1726882383.93772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/AnsiballZ_stat.py" <<< 8238 1726882383.93851: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmprmjwoxz8" to remote "/root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/AnsiballZ_stat.py" <<< 8238 1726882383.97261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882383.97330: stderr chunk (state=3): >>><<< 8238 1726882383.97334: stdout chunk (state=3): >>><<< 8238 1726882383.97355: done transferring module to remote 8238 1726882383.97365: _low_level_execute_command(): starting 8238 1726882383.97369: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/ /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/AnsiballZ_stat.py && sleep 0' 8238 1726882383.97819: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882383.97836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882383.97839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882383.97869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.97872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882383.97875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882383.97877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882383.97928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882383.97931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882383.98026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882383.99866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882383.99920: stderr chunk (state=3): >>><<< 8238 1726882383.99925: stdout chunk (state=3): >>><<< 8238 1726882383.99940: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882383.99943: _low_level_execute_command(): starting 8238 1726882383.99948: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/AnsiballZ_stat.py && sleep 0' 8238 1726882384.00391: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882384.00394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882384.00429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882384.00432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882384.00435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882384.00437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.00497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882384.00500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.00503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.00597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.17201: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35002, "dev": 23, "nlink": 1, "atime": 1726882382.4857733, "mtime": 1726882382.4857733, "ctime": 1726882382.4857733, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, 
"writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8238 1726882384.18830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882384.18836: stdout chunk (state=3): >>><<< 8238 1726882384.18839: stderr chunk (state=3): >>><<< 8238 1726882384.18841: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35002, "dev": 23, "nlink": 1, "atime": 1726882382.4857733, "mtime": 1726882382.4857733, "ctime": 1726882382.4857733, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
8238 1726882384.18845: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882384.18847: _low_level_execute_command(): starting 8238 1726882384.18854: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882383.8800943-8823-51676353952552/ > /dev/null 2>&1 && sleep 0' 8238 1726882384.19501: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882384.19603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.19626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.19743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.21724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.21728: stdout chunk (state=3): >>><<< 8238 1726882384.21734: stderr chunk (state=3): >>><<< 8238 1726882384.21766: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882384.21772: handler run complete 8238 1726882384.21830: attempt loop complete, returning result 8238 1726882384.21833: _execute() done 8238 1726882384.21835: dumping result to json 8238 1726882384.21843: done dumping result, returning 8238 1726882384.21861: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0affc7ec-ae25-54bc-d334-000000000152] 8238 1726882384.21867: sending task result for task 0affc7ec-ae25-54bc-d334-000000000152 8238 1726882384.22099: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000152 8238 1726882384.22102: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882382.4857733, "block_size": 4096, "blocks": 0, "ctime": 1726882382.4857733, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35002, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882382.4857733, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8238 1726882384.22301: no more pending results, returning what we have 8238 1726882384.22304: results queue empty 8238 1726882384.22305: checking for any_errors_fatal 8238 1726882384.22307: done checking for any_errors_fatal 8238 1726882384.22308: checking for max_fail_percentage 8238 1726882384.22309: done checking for max_fail_percentage 8238 1726882384.22310: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.22311: done checking to see if all hosts have failed 8238 1726882384.22312: getting the remaining hosts for this loop 8238 1726882384.22313: done getting the remaining hosts for this loop 8238 1726882384.22317: getting the next task for host managed_node3 8238 1726882384.22451: done getting next task for host managed_node3 8238 1726882384.22454: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8238 1726882384.22457: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882384.22461: getting variables 8238 1726882384.22463: in VariableManager get_vars() 8238 1726882384.22492: Calling all_inventory to load vars for managed_node3 8238 1726882384.22495: Calling groups_inventory to load vars for managed_node3 8238 1726882384.22497: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.22507: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.22510: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.22513: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.22694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.22920: done with get_vars() 8238 1726882384.22933: done getting variables 8238 1726882384.23039: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 8238 1726882384.23171: variable 'interface' from source: task vars 8238 1726882384.23175: variable 'dhcp_interface1' from source: play vars 8238 1726882384.23246: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:04 -0400 (0:00:00.401) 0:00:14.387 ****** 8238 1726882384.23281: entering _queue_task() for managed_node3/assert 8238 1726882384.23283: Creating lock for assert 8238 1726882384.23596: worker is 1 (out of 1 available) 8238 1726882384.23609: exiting _queue_task() for managed_node3/assert 8238 1726882384.23726: done queuing things up, now waiting for results queue to drain 8238 1726882384.23729: waiting for pending results... 
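(The assert task queued above is likewise not reproduced in the trace. Given that the run which follows evaluates the conditional interface_stat.stat.exists and reports "All assertions passed", assert_device_present.yml:5 plausibly contains something like the sketch below; the exact wording of the file is an assumption.)

# Plausible reconstruction of the assert task (inferred from the task banner
# above and the conditional evaluated in the run that follows).
- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists

(The trace resumes below with the TaskExecutor run for this assert on managed_node3.)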
8238 1726882384.23917: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 8238 1726882384.24042: in run() - task 0affc7ec-ae25-54bc-d334-000000000017 8238 1726882384.24069: variable 'ansible_search_path' from source: unknown 8238 1726882384.24083: variable 'ansible_search_path' from source: unknown 8238 1726882384.24127: calling self._execute() 8238 1726882384.24303: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.24307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.24309: variable 'omit' from source: magic vars 8238 1726882384.24670: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.24689: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.24700: variable 'omit' from source: magic vars 8238 1726882384.24771: variable 'omit' from source: magic vars 8238 1726882384.24888: variable 'interface' from source: task vars 8238 1726882384.24899: variable 'dhcp_interface1' from source: play vars 8238 1726882384.24982: variable 'dhcp_interface1' from source: play vars 8238 1726882384.25006: variable 'omit' from source: magic vars 8238 1726882384.25066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882384.25176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882384.25181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882384.25184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.25186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.25227: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882384.25284: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.25288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.25377: Set connection var ansible_connection to ssh 8238 1726882384.25391: Set connection var ansible_shell_type to sh 8238 1726882384.25409: Set connection var ansible_pipelining to False 8238 1726882384.25424: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882384.25436: Set connection var ansible_timeout to 10 8238 1726882384.25452: Set connection var ansible_shell_executable to /bin/sh 8238 1726882384.25503: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.25506: variable 'ansible_connection' from source: unknown 8238 1726882384.25508: variable 'ansible_module_compression' from source: unknown 8238 1726882384.25510: variable 'ansible_shell_type' from source: unknown 8238 1726882384.25608: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.25613: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.25615: variable 'ansible_pipelining' from source: unknown 8238 1726882384.25618: variable 'ansible_timeout' from source: unknown 8238 1726882384.25621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.25719: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882384.25747: variable 'omit' from source: magic vars 8238 1726882384.25760: starting attempt loop 8238 1726882384.25767: running the handler 8238 1726882384.25931: variable 'interface_stat' from source: set_fact 8238 1726882384.25970: Evaluated conditional (interface_stat.stat.exists): True 8238 1726882384.26050: handler run complete 8238 1726882384.26053: attempt loop complete, returning result 8238 1726882384.26055: _execute() done 8238 1726882384.26057: dumping result to json 8238 1726882384.26061: done dumping result, returning 8238 1726882384.26063: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0affc7ec-ae25-54bc-d334-000000000017] 8238 1726882384.26064: sending task result for task 0affc7ec-ae25-54bc-d334-000000000017 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882384.26446: no more pending results, returning what we have 8238 1726882384.26452: results queue empty 8238 1726882384.26453: checking for any_errors_fatal 8238 1726882384.26460: done checking for any_errors_fatal 8238 1726882384.26461: checking for max_fail_percentage 8238 1726882384.26462: done checking for max_fail_percentage 8238 1726882384.26463: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.26464: done checking to see if all hosts have failed 8238 1726882384.26465: getting the remaining hosts for this loop 8238 1726882384.26466: done getting the remaining hosts for this loop 8238 1726882384.26469: getting the next task for host managed_node3 8238 1726882384.26477: done getting next task for host managed_node3 8238 1726882384.26480: ^ task is: TASK: Include the task 'get_interface_stat.yml' 8238 1726882384.26483: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882384.26487: getting variables 8238 1726882384.26488: in VariableManager get_vars() 8238 1726882384.26528: Calling all_inventory to load vars for managed_node3 8238 1726882384.26530: Calling groups_inventory to load vars for managed_node3 8238 1726882384.26533: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.26543: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.26545: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.26551: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.26761: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000017 8238 1726882384.26764: WORKER PROCESS EXITING 8238 1726882384.26790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.27018: done with get_vars() 8238 1726882384.27032: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:04 -0400 (0:00:00.038) 0:00:14.426 ****** 8238 1726882384.27135: entering _queue_task() for managed_node3/include_tasks 8238 1726882384.27456: worker is 1 (out of 1 available) 8238 1726882384.27469: exiting _queue_task() for managed_node3/include_tasks 8238 1726882384.27483: done queuing things up, now waiting for results queue to drain 8238 1726882384.27485: waiting for pending results... 8238 1726882384.27725: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 8238 1726882384.27855: in run() - task 0affc7ec-ae25-54bc-d334-00000000001b 8238 1726882384.27873: variable 'ansible_search_path' from source: unknown 8238 1726882384.27880: variable 'ansible_search_path' from source: unknown 8238 1726882384.27956: calling self._execute() 8238 1726882384.28014: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.28028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.28043: variable 'omit' from source: magic vars 8238 1726882384.28444: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.28497: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.28501: _execute() done 8238 1726882384.28503: dumping result to json 8238 1726882384.28506: done dumping result, returning 8238 1726882384.28508: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-54bc-d334-00000000001b] 8238 1726882384.28510: sending task result for task 0affc7ec-ae25-54bc-d334-00000000001b 8238 1726882384.28770: no more pending results, returning what we have 8238 1726882384.28774: in VariableManager get_vars() 8238 1726882384.28820: Calling all_inventory to load vars for managed_node3 8238 1726882384.28825: Calling groups_inventory to load vars for managed_node3 8238 1726882384.28828: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.28842: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.28845: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.28848: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.29121: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000001b 8238 1726882384.29126: WORKER PROCESS EXITING 8238 
1726882384.29157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.29376: done with get_vars() 8238 1726882384.29384: variable 'ansible_search_path' from source: unknown 8238 1726882384.29385: variable 'ansible_search_path' from source: unknown 8238 1726882384.29426: we have included files to process 8238 1726882384.29427: generating all_blocks data 8238 1726882384.29428: done generating all_blocks data 8238 1726882384.29433: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882384.29434: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882384.29437: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882384.29639: done processing included file 8238 1726882384.29641: iterating over new_blocks loaded from include file 8238 1726882384.29643: in VariableManager get_vars() 8238 1726882384.29665: done with get_vars() 8238 1726882384.29667: filtering new block on tags 8238 1726882384.29683: done filtering new block on tags 8238 1726882384.29690: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 8238 1726882384.29695: extending task lists for all hosts with included blocks 8238 1726882384.29810: done extending task lists 8238 1726882384.29811: done processing included files 8238 1726882384.29812: results queue empty 8238 1726882384.29813: checking for any_errors_fatal 8238 1726882384.29816: done checking for any_errors_fatal 8238 1726882384.29817: checking for max_fail_percentage 8238 1726882384.29818: done checking for max_fail_percentage 8238 1726882384.29819: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.29820: done checking to see if all hosts have failed 8238 1726882384.29821: getting the remaining hosts for this loop 8238 1726882384.29824: done getting the remaining hosts for this loop 8238 1726882384.29826: getting the next task for host managed_node3 8238 1726882384.29831: done getting next task for host managed_node3 8238 1726882384.29833: ^ task is: TASK: Get stat for interface {{ interface }} 8238 1726882384.29836: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882384.29839: getting variables 8238 1726882384.29840: in VariableManager get_vars() 8238 1726882384.29855: Calling all_inventory to load vars for managed_node3 8238 1726882384.29858: Calling groups_inventory to load vars for managed_node3 8238 1726882384.29860: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.29865: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.29868: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.29871: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.30059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.30266: done with get_vars() 8238 1726882384.30275: done getting variables 8238 1726882384.30438: variable 'interface' from source: task vars 8238 1726882384.30443: variable 'dhcp_interface2' from source: play vars 8238 1726882384.30511: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:04 -0400 (0:00:00.034) 0:00:14.460 ****** 8238 1726882384.30541: entering _queue_task() for managed_node3/stat 8238 1726882384.30868: worker is 1 (out of 1 available) 8238 1726882384.30882: exiting _queue_task() for managed_node3/stat 8238 1726882384.31010: done queuing things up, now waiting for results queue to drain 8238 1726882384.31012: waiting for pending results... 8238 1726882384.31173: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 8238 1726882384.31318: in run() - task 0affc7ec-ae25-54bc-d334-00000000016a 8238 1726882384.31347: variable 'ansible_search_path' from source: unknown 8238 1726882384.31429: variable 'ansible_search_path' from source: unknown 8238 1726882384.31437: calling self._execute() 8238 1726882384.31514: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.31529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.31547: variable 'omit' from source: magic vars 8238 1726882384.31974: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.31998: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.32014: variable 'omit' from source: magic vars 8238 1726882384.32082: variable 'omit' from source: magic vars 8238 1726882384.32204: variable 'interface' from source: task vars 8238 1726882384.32216: variable 'dhcp_interface2' from source: play vars 8238 1726882384.32313: variable 'dhcp_interface2' from source: play vars 8238 1726882384.32326: variable 'omit' from source: magic vars 8238 1726882384.32386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882384.32451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882384.32529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882384.32532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.32534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.32553: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882384.32562: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.32568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.32680: Set connection var ansible_connection to ssh 8238 1726882384.32687: Set connection var ansible_shell_type to sh 8238 1726882384.32695: Set connection var ansible_pipelining to False 8238 1726882384.32750: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882384.32754: Set connection var ansible_timeout to 10 8238 1726882384.32756: Set connection var ansible_shell_executable to /bin/sh 8238 1726882384.32759: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.32761: variable 'ansible_connection' from source: unknown 8238 1726882384.32766: variable 'ansible_module_compression' from source: unknown 8238 1726882384.32778: variable 'ansible_shell_type' from source: unknown 8238 1726882384.32784: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.32791: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.32798: variable 'ansible_pipelining' from source: unknown 8238 1726882384.32806: variable 'ansible_timeout' from source: unknown 8238 1726882384.32814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.33076: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882384.33080: variable 'omit' from source: magic vars 8238 1726882384.33082: starting attempt loop 8238 1726882384.33085: running the handler 8238 1726882384.33100: _low_level_execute_command(): starting 8238 1726882384.33113: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882384.33923: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882384.33940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882384.33961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882384.33988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882384.34072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.34114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882384.34143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.34163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.34292: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.35996: stdout chunk (state=3): >>>/root <<< 8238 1726882384.36448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.36451: stdout chunk (state=3): >>><<< 8238 1726882384.36454: stderr chunk (state=3): >>><<< 8238 1726882384.36457: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882384.36459: _low_level_execute_command(): starting 8238 1726882384.36461: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895 `" && echo ansible-tmp-1726882384.3633854-8841-30467916083895="` echo /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895 `" ) && sleep 0' 8238 1726882384.37005: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882384.37113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.37142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.37261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.39332: stdout chunk (state=3): >>>ansible-tmp-1726882384.3633854-8841-30467916083895=/root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895 <<< 8238 
1726882384.39454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.39551: stderr chunk (state=3): >>><<< 8238 1726882384.39555: stdout chunk (state=3): >>><<< 8238 1726882384.39663: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882384.3633854-8841-30467916083895=/root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882384.39667: variable 'ansible_module_compression' from source: unknown 8238 1726882384.39802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8238 1726882384.40129: variable 'ansible_facts' from source: unknown 8238 1726882384.40286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/AnsiballZ_stat.py 8238 1726882384.40691: Sending initial data 8238 1726882384.40735: Sent initial data (150 bytes) 8238 1726882384.41835: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882384.41886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882384.41898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.41910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.42032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882384.42103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.42139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 8238 1726882384.42241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.43869: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8238 1726882384.43906: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882384.44012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882384.44087: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpo9w4_4fl /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/AnsiballZ_stat.py <<< 8238 1726882384.44102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/AnsiballZ_stat.py" <<< 8238 1726882384.44231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpo9w4_4fl" to remote "/root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/AnsiballZ_stat.py" <<< 8238 1726882384.45490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.45494: stdout chunk (state=3): >>><<< 8238 1726882384.45496: stderr chunk (state=3): >>><<< 8238 1726882384.45498: done transferring module to remote 8238 1726882384.45500: _low_level_execute_command(): starting 8238 1726882384.45502: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/ /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/AnsiballZ_stat.py && sleep 0' 8238 1726882384.46128: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882384.46132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882384.46145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.46177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882384.46187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 
1726882384.46272: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.46300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.46420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.48360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.48494: stdout chunk (state=3): >>><<< 8238 1726882384.48497: stderr chunk (state=3): >>><<< 8238 1726882384.48500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882384.48503: _low_level_execute_command(): starting 8238 1726882384.48505: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/AnsiballZ_stat.py && sleep 0' 8238 1726882384.49130: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882384.49149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882384.49173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882384.49288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.49309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.49438: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.68321: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35408, "dev": 23, "nlink": 1, "atime": 1726882382.492562, "mtime": 1726882382.492562, "ctime": 1726882382.492562, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8238 1726882384.69666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882384.69718: stderr chunk (state=3): >>><<< 8238 1726882384.69723: stdout chunk (state=3): >>><<< 8238 1726882384.69740: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35408, "dev": 23, "nlink": 1, "atime": 1726882382.492562, "mtime": 1726882382.492562, "ctime": 1726882382.492562, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.45.226 closed. 8238 1726882384.69781: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882384.69789: _low_level_execute_command(): starting 8238 1726882384.69794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882384.3633854-8841-30467916083895/ > /dev/null 2>&1 && sleep 0' 8238 1726882384.70249: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882384.70253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.70262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882384.70264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882384.70267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.70309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882384.70313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.70404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.72346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.72395: stderr chunk (state=3): >>><<< 8238 1726882384.72398: stdout chunk (state=3): >>><<< 8238 1726882384.72415: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882384.72421: handler run complete 8238 1726882384.72461: attempt loop complete, returning result 8238 1726882384.72464: _execute() done 8238 1726882384.72467: dumping result to json 8238 1726882384.72475: done dumping result, returning 8238 1726882384.72484: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0affc7ec-ae25-54bc-d334-00000000016a] 8238 1726882384.72489: sending task result for task 0affc7ec-ae25-54bc-d334-00000000016a 8238 1726882384.72597: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000016a 8238 1726882384.72600: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882382.492562, "block_size": 4096, "blocks": 0, "ctime": 1726882382.492562, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35408, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882382.492562, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8238 1726882384.72698: no more pending results, returning what we have 8238 1726882384.72701: results queue empty 8238 1726882384.72702: checking for any_errors_fatal 8238 1726882384.72704: done checking for any_errors_fatal 8238 1726882384.72704: checking for max_fail_percentage 8238 1726882384.72705: done checking for max_fail_percentage 8238 1726882384.72706: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.72707: done checking to see if all hosts have failed 8238 1726882384.72708: getting the remaining hosts for this loop 8238 1726882384.72711: done getting the remaining hosts for this loop 8238 1726882384.72715: getting the next task for host managed_node3 8238 1726882384.72725: done getting next task for host managed_node3 8238 1726882384.72727: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8238 1726882384.72729: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882384.72733: getting variables 8238 1726882384.72734: in VariableManager get_vars() 8238 1726882384.72769: Calling all_inventory to load vars for managed_node3 8238 1726882384.72772: Calling groups_inventory to load vars for managed_node3 8238 1726882384.72774: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.72784: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.72787: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.72789: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.72930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.73060: done with get_vars() 8238 1726882384.73068: done getting variables 8238 1726882384.73112: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882384.73205: variable 'interface' from source: task vars 8238 1726882384.73208: variable 'dhcp_interface2' from source: play vars 8238 1726882384.73252: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:04 -0400 (0:00:00.427) 0:00:14.887 ****** 8238 1726882384.73278: entering _queue_task() for managed_node3/assert 8238 1726882384.73476: worker is 1 (out of 1 available) 8238 1726882384.73488: exiting _queue_task() for managed_node3/assert 8238 1726882384.73500: done queuing things up, now waiting for results queue to drain 8238 1726882384.73502: waiting for pending results... 
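
For reference, the module_args captured in the stat result above (path /sys/class/net/test2 with attributes, checksum, and mime disabled) suggest the task that produced interface_stat looks roughly like the sketch below. This is an illustration inferred from the log, not the exact source of assert_device_present.yml; the register name and the {{ interface }} variable are taken from the surrounding trace.

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # registered results show up as "source: set_fact" in the trace
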
8238 1726882384.73668: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 8238 1726882384.73741: in run() - task 0affc7ec-ae25-54bc-d334-00000000001c 8238 1726882384.73757: variable 'ansible_search_path' from source: unknown 8238 1726882384.73761: variable 'ansible_search_path' from source: unknown 8238 1726882384.73788: calling self._execute() 8238 1726882384.74227: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.74233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.74237: variable 'omit' from source: magic vars 8238 1726882384.74491: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.74509: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.74519: variable 'omit' from source: magic vars 8238 1726882384.74576: variable 'omit' from source: magic vars 8238 1726882384.74677: variable 'interface' from source: task vars 8238 1726882384.74687: variable 'dhcp_interface2' from source: play vars 8238 1726882384.74758: variable 'dhcp_interface2' from source: play vars 8238 1726882384.74782: variable 'omit' from source: magic vars 8238 1726882384.74873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882384.74877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882384.74898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882384.74924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.74940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.74985: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882384.74993: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.75002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.75118: Set connection var ansible_connection to ssh 8238 1726882384.75130: Set connection var ansible_shell_type to sh 8238 1726882384.75169: Set connection var ansible_pipelining to False 8238 1726882384.75172: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882384.75174: Set connection var ansible_timeout to 10 8238 1726882384.75177: Set connection var ansible_shell_executable to /bin/sh 8238 1726882384.75200: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.75227: variable 'ansible_connection' from source: unknown 8238 1726882384.75230: variable 'ansible_module_compression' from source: unknown 8238 1726882384.75233: variable 'ansible_shell_type' from source: unknown 8238 1726882384.75235: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.75237: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.75244: variable 'ansible_pipelining' from source: unknown 8238 1726882384.75247: variable 'ansible_timeout' from source: unknown 8238 1726882384.75254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.75368: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882384.75376: variable 'omit' from source: magic vars 8238 1726882384.75381: starting attempt loop 8238 1726882384.75384: running the handler 8238 1726882384.75480: variable 'interface_stat' from source: set_fact 8238 1726882384.75494: Evaluated conditional (interface_stat.stat.exists): True 8238 1726882384.75500: handler run complete 8238 1726882384.75511: attempt loop complete, returning result 8238 1726882384.75514: _execute() done 8238 1726882384.75526: dumping result to json 8238 1726882384.75531: done dumping result, returning 8238 1726882384.75534: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0affc7ec-ae25-54bc-d334-00000000001c] 8238 1726882384.75539: sending task result for task 0affc7ec-ae25-54bc-d334-00000000001c ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882384.75672: no more pending results, returning what we have 8238 1726882384.75676: results queue empty 8238 1726882384.75676: checking for any_errors_fatal 8238 1726882384.75682: done checking for any_errors_fatal 8238 1726882384.75683: checking for max_fail_percentage 8238 1726882384.75684: done checking for max_fail_percentage 8238 1726882384.75685: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.75686: done checking to see if all hosts have failed 8238 1726882384.75687: getting the remaining hosts for this loop 8238 1726882384.75689: done getting the remaining hosts for this loop 8238 1726882384.75692: getting the next task for host managed_node3 8238 1726882384.75699: done getting next task for host managed_node3 8238 1726882384.75702: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 8238 1726882384.75703: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882384.75706: getting variables 8238 1726882384.75707: in VariableManager get_vars() 8238 1726882384.75939: Calling all_inventory to load vars for managed_node3 8238 1726882384.75944: Calling groups_inventory to load vars for managed_node3 8238 1726882384.75947: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.75956: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.75958: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.75961: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.76059: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000001c 8238 1726882384.76063: WORKER PROCESS EXITING 8238 1726882384.76070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.76185: done with get_vars() 8238 1726882384.76191: done getting variables 8238 1726882384.76235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Friday 20 September 2024 21:33:04 -0400 (0:00:00.029) 0:00:14.917 ****** 8238 1726882384.76254: entering _queue_task() for managed_node3/command 8238 1726882384.76437: worker is 1 (out of 1 available) 8238 1726882384.76451: exiting _queue_task() for managed_node3/command 8238 1726882384.76463: done queuing things up, now waiting for results queue to drain 8238 1726882384.76465: waiting for pending results... 
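
The assertion that just passed evaluates interface_stat.stat.exists. A minimal sketch of such a task is shown below; the msg text is illustrative and not quoted from the playbook.

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
    msg: "Interface {{ interface }} is not present"   # illustrative message, not from the source playbook
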
8238 1726882384.76623: running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript 8238 1726882384.76685: in run() - task 0affc7ec-ae25-54bc-d334-00000000001d 8238 1726882384.76695: variable 'ansible_search_path' from source: unknown 8238 1726882384.76728: calling self._execute() 8238 1726882384.76797: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.76801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.76809: variable 'omit' from source: magic vars 8238 1726882384.77128: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.77132: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.77169: variable 'network_provider' from source: set_fact 8238 1726882384.77173: Evaluated conditional (network_provider == "initscripts"): False 8238 1726882384.77177: when evaluation is False, skipping this task 8238 1726882384.77180: _execute() done 8238 1726882384.77183: dumping result to json 8238 1726882384.77185: done dumping result, returning 8238 1726882384.77250: done running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript [0affc7ec-ae25-54bc-d334-00000000001d] 8238 1726882384.77254: sending task result for task 0affc7ec-ae25-54bc-d334-00000000001d 8238 1726882384.77330: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000001d 8238 1726882384.77334: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8238 1726882384.77381: no more pending results, returning what we have 8238 1726882384.77384: results queue empty 8238 1726882384.77385: checking for any_errors_fatal 8238 1726882384.77391: done checking for any_errors_fatal 8238 1726882384.77392: checking for max_fail_percentage 8238 1726882384.77393: done checking for max_fail_percentage 8238 1726882384.77394: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.77395: done checking to see if all hosts have failed 8238 1726882384.77396: getting the remaining hosts for this loop 8238 1726882384.77397: done getting the remaining hosts for this loop 8238 1726882384.77400: getting the next task for host managed_node3 8238 1726882384.77404: done getting next task for host managed_node3 8238 1726882384.77405: ^ task is: TASK: TEST Add Bond with 2 ports 8238 1726882384.77407: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882384.77409: getting variables 8238 1726882384.77410: in VariableManager get_vars() 8238 1726882384.77440: Calling all_inventory to load vars for managed_node3 8238 1726882384.77442: Calling groups_inventory to load vars for managed_node3 8238 1726882384.77444: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.77451: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.77453: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.77455: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.77562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.77693: done with get_vars() 8238 1726882384.77700: done getting variables 8238 1726882384.77742: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Friday 20 September 2024 21:33:04 -0400 (0:00:00.015) 0:00:14.932 ****** 8238 1726882384.77761: entering _queue_task() for managed_node3/debug 8238 1726882384.77934: worker is 1 (out of 1 available) 8238 1726882384.77947: exiting _queue_task() for managed_node3/debug 8238 1726882384.77958: done queuing things up, now waiting for results queue to drain 8238 1726882384.77960: waiting for pending results... 8238 1726882384.78103: running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports 8238 1726882384.78161: in run() - task 0affc7ec-ae25-54bc-d334-00000000001e 8238 1726882384.78172: variable 'ansible_search_path' from source: unknown 8238 1726882384.78199: calling self._execute() 8238 1726882384.78265: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.78269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.78277: variable 'omit' from source: magic vars 8238 1726882384.78529: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.78539: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.78550: variable 'omit' from source: magic vars 8238 1726882384.78564: variable 'omit' from source: magic vars 8238 1726882384.78588: variable 'omit' from source: magic vars 8238 1726882384.78621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882384.78647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882384.78667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882384.78681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.78690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.78713: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882384.78716: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.78719: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.78795: Set connection var ansible_connection to ssh 8238 1726882384.78799: Set connection var ansible_shell_type to sh 8238 1726882384.78802: Set connection var ansible_pipelining to False 8238 1726882384.78809: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882384.78815: Set connection var ansible_timeout to 10 8238 1726882384.78823: Set connection var ansible_shell_executable to /bin/sh 8238 1726882384.78840: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.78843: variable 'ansible_connection' from source: unknown 8238 1726882384.78845: variable 'ansible_module_compression' from source: unknown 8238 1726882384.78848: variable 'ansible_shell_type' from source: unknown 8238 1726882384.78853: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.78856: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.78860: variable 'ansible_pipelining' from source: unknown 8238 1726882384.78863: variable 'ansible_timeout' from source: unknown 8238 1726882384.78868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.78972: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882384.78986: variable 'omit' from source: magic vars 8238 1726882384.78989: starting attempt loop 8238 1726882384.78992: running the handler 8238 1726882384.79025: handler run complete 8238 1726882384.79038: attempt loop complete, returning result 8238 1726882384.79040: _execute() done 8238 1726882384.79043: dumping result to json 8238 1726882384.79046: done dumping result, returning 8238 1726882384.79056: done running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports [0affc7ec-ae25-54bc-d334-00000000001e] 8238 1726882384.79061: sending task result for task 0affc7ec-ae25-54bc-d334-00000000001e 8238 1726882384.79151: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000001e 8238 1726882384.79154: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 8238 1726882384.79199: no more pending results, returning what we have 8238 1726882384.79202: results queue empty 8238 1726882384.79204: checking for any_errors_fatal 8238 1726882384.79208: done checking for any_errors_fatal 8238 1726882384.79209: checking for max_fail_percentage 8238 1726882384.79210: done checking for max_fail_percentage 8238 1726882384.79211: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.79212: done checking to see if all hosts have failed 8238 1726882384.79213: getting the remaining hosts for this loop 8238 1726882384.79214: done getting the remaining hosts for this loop 8238 1726882384.79217: getting the next task for host managed_node3 8238 1726882384.79225: done getting next task for host managed_node3 8238 1726882384.79229: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8238 1726882384.79232: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882384.79252: getting variables 8238 1726882384.79253: in VariableManager get_vars() 8238 1726882384.79281: Calling all_inventory to load vars for managed_node3 8238 1726882384.79282: Calling groups_inventory to load vars for managed_node3 8238 1726882384.79284: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.79290: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.79291: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.79293: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.79400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.79529: done with get_vars() 8238 1726882384.79536: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:04 -0400 (0:00:00.018) 0:00:14.951 ****** 8238 1726882384.79603: entering _queue_task() for managed_node3/include_tasks 8238 1726882384.79779: worker is 1 (out of 1 available) 8238 1726882384.79792: exiting _queue_task() for managed_node3/include_tasks 8238 1726882384.79805: done queuing things up, now waiting for results queue to drain 8238 1726882384.79807: waiting for pending results... 
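
The two tests_bond.yml tasks traced above, the conditionally skipped resolv.conf backup and the banner debug, would look roughly like the sketch below. Only the task names, the command/debug actions, and the network_provider condition are visible in the log; the actual command line is an assumption.

- name: Backup the /etc/resolv.conf for initscript
  ansible.builtin.command: cp /etc/resolv.conf /etc/resolv.conf.bak   # assumed command; the log only shows the task name and the 'command' action
  when: network_provider == "initscripts"

- name: TEST Add Bond with 2 ports
  ansible.builtin.debug:
    msg: "##################################################"
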
8238 1726882384.79950: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8238 1726882384.80026: in run() - task 0affc7ec-ae25-54bc-d334-000000000026 8238 1726882384.80127: variable 'ansible_search_path' from source: unknown 8238 1726882384.80131: variable 'ansible_search_path' from source: unknown 8238 1726882384.80134: calling self._execute() 8238 1726882384.80136: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.80140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.80142: variable 'omit' from source: magic vars 8238 1726882384.80402: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.80412: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.80417: _execute() done 8238 1726882384.80423: dumping result to json 8238 1726882384.80428: done dumping result, returning 8238 1726882384.80434: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-54bc-d334-000000000026] 8238 1726882384.80439: sending task result for task 0affc7ec-ae25-54bc-d334-000000000026 8238 1726882384.80527: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000026 8238 1726882384.80530: WORKER PROCESS EXITING 8238 1726882384.80571: no more pending results, returning what we have 8238 1726882384.80575: in VariableManager get_vars() 8238 1726882384.80610: Calling all_inventory to load vars for managed_node3 8238 1726882384.80613: Calling groups_inventory to load vars for managed_node3 8238 1726882384.80615: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.80625: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.80628: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.80631: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.80770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.80890: done with get_vars() 8238 1726882384.80895: variable 'ansible_search_path' from source: unknown 8238 1726882384.80896: variable 'ansible_search_path' from source: unknown 8238 1726882384.80924: we have included files to process 8238 1726882384.80925: generating all_blocks data 8238 1726882384.80926: done generating all_blocks data 8238 1726882384.80929: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8238 1726882384.80930: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8238 1726882384.80932: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8238 1726882384.81415: done processing included file 8238 1726882384.81417: iterating over new_blocks loaded from include file 8238 1726882384.81418: in VariableManager get_vars() 8238 1726882384.81434: done with get_vars() 8238 1726882384.81436: filtering new block on tags 8238 1726882384.81447: done filtering new block on tags 8238 1726882384.81451: in VariableManager get_vars() 8238 1726882384.81465: done with get_vars() 8238 1726882384.81466: filtering new block on tags 8238 1726882384.81479: done filtering new block on tags 8238 1726882384.81480: in VariableManager get_vars() 8238 1726882384.81493: done 
with get_vars() 8238 1726882384.81494: filtering new block on tags 8238 1726882384.81507: done filtering new block on tags 8238 1726882384.81508: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 8238 1726882384.81512: extending task lists for all hosts with included blocks 8238 1726882384.82053: done extending task lists 8238 1726882384.82054: done processing included files 8238 1726882384.82055: results queue empty 8238 1726882384.82055: checking for any_errors_fatal 8238 1726882384.82057: done checking for any_errors_fatal 8238 1726882384.82057: checking for max_fail_percentage 8238 1726882384.82058: done checking for max_fail_percentage 8238 1726882384.82059: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.82059: done checking to see if all hosts have failed 8238 1726882384.82060: getting the remaining hosts for this loop 8238 1726882384.82060: done getting the remaining hosts for this loop 8238 1726882384.82062: getting the next task for host managed_node3 8238 1726882384.82065: done getting next task for host managed_node3 8238 1726882384.82066: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8238 1726882384.82068: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882384.82074: getting variables 8238 1726882384.82075: in VariableManager get_vars() 8238 1726882384.82085: Calling all_inventory to load vars for managed_node3 8238 1726882384.82086: Calling groups_inventory to load vars for managed_node3 8238 1726882384.82087: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.82091: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.82092: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.82094: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.82195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.82313: done with get_vars() 8238 1726882384.82319: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:04 -0400 (0:00:00.027) 0:00:14.978 ****** 8238 1726882384.82371: entering _queue_task() for managed_node3/setup 8238 1726882384.82541: worker is 1 (out of 1 available) 8238 1726882384.82556: exiting _queue_task() for managed_node3/setup 8238 1726882384.82567: done queuing things up, now waiting for results queue to drain 8238 1726882384.82568: waiting for pending results... 8238 1726882384.82712: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8238 1726882384.82796: in run() - task 0affc7ec-ae25-54bc-d334-000000000188 8238 1726882384.82809: variable 'ansible_search_path' from source: unknown 8238 1726882384.82812: variable 'ansible_search_path' from source: unknown 8238 1726882384.82841: calling self._execute() 8238 1726882384.82904: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.82908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.82916: variable 'omit' from source: magic vars 8238 1726882384.83179: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.83189: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.83344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882384.84904: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882384.84951: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882384.84982: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882384.85020: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882384.85042: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882384.85109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882384.85133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882384.85151: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882384.85182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882384.85196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882384.85240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882384.85260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882384.85278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882384.85308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882384.85319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882384.85436: variable '__network_required_facts' from source: role '' defaults 8238 1726882384.85443: variable 'ansible_facts' from source: unknown 8238 1726882384.85501: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8238 1726882384.85505: when evaluation is False, skipping this task 8238 1726882384.85508: _execute() done 8238 1726882384.85512: dumping result to json 8238 1726882384.85514: done dumping result, returning 8238 1726882384.85526: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-54bc-d334-000000000188] 8238 1726882384.85530: sending task result for task 0affc7ec-ae25-54bc-d334-000000000188 8238 1726882384.85613: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000188 8238 1726882384.85616: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882384.85665: no more pending results, returning what we have 8238 1726882384.85668: results queue empty 8238 1726882384.85669: checking for any_errors_fatal 8238 1726882384.85671: done checking for any_errors_fatal 8238 1726882384.85671: checking for max_fail_percentage 8238 1726882384.85673: done checking for max_fail_percentage 8238 1726882384.85674: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.85675: done checking to see if all hosts have failed 8238 1726882384.85675: getting the remaining hosts for this loop 8238 1726882384.85677: done getting the remaining hosts for this loop 8238 1726882384.85680: getting the next task for host managed_node3 
8238 1726882384.85690: done getting next task for host managed_node3 8238 1726882384.85694: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 8238 1726882384.85698: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882384.85711: getting variables 8238 1726882384.85713: in VariableManager get_vars() 8238 1726882384.85752: Calling all_inventory to load vars for managed_node3 8238 1726882384.85755: Calling groups_inventory to load vars for managed_node3 8238 1726882384.85757: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.85765: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.85767: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.85770: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.85907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.86069: done with get_vars() 8238 1726882384.86077: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:04 -0400 (0:00:00.037) 0:00:15.016 ****** 8238 1726882384.86151: entering _queue_task() for managed_node3/stat 8238 1726882384.86339: worker is 1 (out of 1 available) 8238 1726882384.86355: exiting _queue_task() for managed_node3/stat 8238 1726882384.86366: done queuing things up, now waiting for results queue to drain 8238 1726882384.86368: waiting for pending results... 
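
The setup task skipped above only gathers facts when something listed in __network_required_facts is missing from ansible_facts. A sketch of that gating pattern from set_facts.yml follows; the when expression is taken from the log, while the gather_subset value is an assumption since the module arguments are not shown.

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumed subset; module arguments are not visible in the trace
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
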
8238 1726882384.86527: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 8238 1726882384.86626: in run() - task 0affc7ec-ae25-54bc-d334-00000000018a 8238 1726882384.86638: variable 'ansible_search_path' from source: unknown 8238 1726882384.86641: variable 'ansible_search_path' from source: unknown 8238 1726882384.86672: calling self._execute() 8238 1726882384.86738: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.86742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.86753: variable 'omit' from source: magic vars 8238 1726882384.87018: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.87030: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.87228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882384.87360: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882384.87391: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882384.87417: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882384.87444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882384.87512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882384.87536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882384.87559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882384.87583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882384.87649: variable '__network_is_ostree' from source: set_fact 8238 1726882384.87657: Evaluated conditional (not __network_is_ostree is defined): False 8238 1726882384.87660: when evaluation is False, skipping this task 8238 1726882384.87663: _execute() done 8238 1726882384.87668: dumping result to json 8238 1726882384.87670: done dumping result, returning 8238 1726882384.87679: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-54bc-d334-00000000018a] 8238 1726882384.87682: sending task result for task 0affc7ec-ae25-54bc-d334-00000000018a 8238 1726882384.87765: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000018a 8238 1726882384.87768: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8238 1726882384.87839: no more pending results, returning what we have 8238 1726882384.87842: results queue empty 8238 1726882384.87843: checking for any_errors_fatal 8238 1726882384.87848: done checking for any_errors_fatal 8238 1726882384.87848: checking for max_fail_percentage 8238 
1726882384.87850: done checking for max_fail_percentage 8238 1726882384.87851: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.87852: done checking to see if all hosts have failed 8238 1726882384.87852: getting the remaining hosts for this loop 8238 1726882384.87853: done getting the remaining hosts for this loop 8238 1726882384.87856: getting the next task for host managed_node3 8238 1726882384.87862: done getting next task for host managed_node3 8238 1726882384.87865: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8238 1726882384.87869: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882384.87881: getting variables 8238 1726882384.87882: in VariableManager get_vars() 8238 1726882384.87915: Calling all_inventory to load vars for managed_node3 8238 1726882384.87917: Calling groups_inventory to load vars for managed_node3 8238 1726882384.87919: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.87929: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.87932: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.87935: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.88052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.88182: done with get_vars() 8238 1726882384.88189: done getting variables 8238 1726882384.88233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:04 -0400 (0:00:00.021) 0:00:15.037 ****** 8238 1726882384.88257: entering _queue_task() for managed_node3/set_fact 8238 1726882384.88449: worker is 1 (out of 1 available) 8238 1726882384.88461: exiting _queue_task() for managed_node3/set_fact 8238 1726882384.88475: done queuing things up, now waiting for results queue to drain 8238 1726882384.88477: waiting for pending results... 
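The companion task at set_facts.yml:17 is a set_fact action skipped under the same condition. The sketch below assumes it derives __network_is_ostree from the stat result registered by the previous task; only the task name, the set_fact module, the fact name __network_is_ostree, and the when condition are confirmed by the trace.

# Sketch only - the value expression and its source variable are assumptions.
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # hypothetical; real expression not in the log
  when: not __network_is_ostree is defined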
8238 1726882384.88631: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8238 1726882384.88728: in run() - task 0affc7ec-ae25-54bc-d334-00000000018b 8238 1726882384.88741: variable 'ansible_search_path' from source: unknown 8238 1726882384.88746: variable 'ansible_search_path' from source: unknown 8238 1726882384.88776: calling self._execute() 8238 1726882384.88840: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.88844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.88857: variable 'omit' from source: magic vars 8238 1726882384.89127: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.89227: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.89268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882384.89576: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882384.89581: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882384.89584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882384.89606: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882384.89679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882384.89700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882384.89718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882384.89740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882384.89804: variable '__network_is_ostree' from source: set_fact 8238 1726882384.89809: Evaluated conditional (not __network_is_ostree is defined): False 8238 1726882384.89812: when evaluation is False, skipping this task 8238 1726882384.89814: _execute() done 8238 1726882384.89821: dumping result to json 8238 1726882384.89826: done dumping result, returning 8238 1726882384.89832: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-54bc-d334-00000000018b] 8238 1726882384.89838: sending task result for task 0affc7ec-ae25-54bc-d334-00000000018b 8238 1726882384.89921: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000018b 8238 1726882384.89926: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8238 1726882384.89975: no more pending results, returning what we have 8238 1726882384.89979: results queue empty 8238 1726882384.89980: checking for any_errors_fatal 8238 1726882384.89982: done checking for any_errors_fatal 8238 1726882384.89983: checking for 
max_fail_percentage 8238 1726882384.89984: done checking for max_fail_percentage 8238 1726882384.89985: checking to see if all hosts have failed and the running result is not ok 8238 1726882384.89986: done checking to see if all hosts have failed 8238 1726882384.89987: getting the remaining hosts for this loop 8238 1726882384.89988: done getting the remaining hosts for this loop 8238 1726882384.89992: getting the next task for host managed_node3 8238 1726882384.90000: done getting next task for host managed_node3 8238 1726882384.90004: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 8238 1726882384.90008: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882384.90021: getting variables 8238 1726882384.90023: in VariableManager get_vars() 8238 1726882384.90060: Calling all_inventory to load vars for managed_node3 8238 1726882384.90063: Calling groups_inventory to load vars for managed_node3 8238 1726882384.90065: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882384.90073: Calling all_plugins_play to load vars for managed_node3 8238 1726882384.90075: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882384.90078: Calling groups_plugins_play to load vars for managed_node3 8238 1726882384.90219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882384.90347: done with get_vars() 8238 1726882384.90356: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:04 -0400 (0:00:00.021) 0:00:15.059 ****** 8238 1726882384.90425: entering _queue_task() for managed_node3/service_facts 8238 1726882384.90427: Creating lock for service_facts 8238 1726882384.90628: worker is 1 (out of 1 available) 8238 1726882384.90642: exiting _queue_task() for managed_node3/service_facts 8238 1726882384.90656: done queuing things up, now waiting for results queue to drain 8238 1726882384.90658: waiting for pending results... 
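The next task, at set_facts.yml:21, runs the service_facts module over SSH; the full execution, including module transfer and the returned ansible_facts.services dictionary, follows below. As a minimal sketch, the task itself takes no arguments. The second task underneath is purely illustrative (not part of the role) and only shows how the returned services map is typically consumed.

# Confirmed by the trace: a bare service_facts call.
- name: Check which services are running
  ansible.builtin.service_facts:

# Hypothetical example, for illustration only: reading the returned
# ansible_facts.services map, e.g. to check whether NetworkManager is active.
- name: Example only - record whether NetworkManager is running
  ansible.builtin.set_fact:
    __example_nm_running: "{{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"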
8238 1726882384.90806: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 8238 1726882384.90899: in run() - task 0affc7ec-ae25-54bc-d334-00000000018d 8238 1726882384.90910: variable 'ansible_search_path' from source: unknown 8238 1726882384.90913: variable 'ansible_search_path' from source: unknown 8238 1726882384.90943: calling self._execute() 8238 1726882384.91005: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.91010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.91020: variable 'omit' from source: magic vars 8238 1726882384.91281: variable 'ansible_distribution_major_version' from source: facts 8238 1726882384.91290: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882384.91296: variable 'omit' from source: magic vars 8238 1726882384.91352: variable 'omit' from source: magic vars 8238 1726882384.91375: variable 'omit' from source: magic vars 8238 1726882384.91406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882384.91437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882384.91455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882384.91467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.91477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882384.91502: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882384.91505: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.91507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.91585: Set connection var ansible_connection to ssh 8238 1726882384.91589: Set connection var ansible_shell_type to sh 8238 1726882384.91591: Set connection var ansible_pipelining to False 8238 1726882384.91597: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882384.91603: Set connection var ansible_timeout to 10 8238 1726882384.91611: Set connection var ansible_shell_executable to /bin/sh 8238 1726882384.91629: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.91632: variable 'ansible_connection' from source: unknown 8238 1726882384.91635: variable 'ansible_module_compression' from source: unknown 8238 1726882384.91637: variable 'ansible_shell_type' from source: unknown 8238 1726882384.91639: variable 'ansible_shell_executable' from source: unknown 8238 1726882384.91643: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882384.91646: variable 'ansible_pipelining' from source: unknown 8238 1726882384.91650: variable 'ansible_timeout' from source: unknown 8238 1726882384.91653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882384.91798: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882384.91806: variable 'omit' from source: magic vars 8238 1726882384.91811: starting attempt loop 
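Before running the handler, the executor has resolved the connection settings listed above for managed_node3. Purely as a sketch, the inventory fragment below would make those settings explicit; most of them are ordinary Ansible defaults rather than values necessarily present in the actual inventory file, and ansible_ssh_extra_args is omitted because the trace never prints its value.

# Illustrative inventory only - values reflect what the executor resolved,
# not necessarily what the real inventory contains.
all:
  hosts:
    managed_node3:
      ansible_host: 10.31.45.226          # address visible in the SSH debug output below
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_pipelining: false
      ansible_timeout: 10
      ansible_module_compression: ZIP_DEFLATED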
8238 1726882384.91814: running the handler 8238 1726882384.91827: _low_level_execute_command(): starting 8238 1726882384.91834: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882384.92373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882384.92376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.92379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882384.92382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882384.92384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882384.92428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882384.92453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.92457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.92539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.94265: stdout chunk (state=3): >>>/root <<< 8238 1726882384.94426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.94596: stderr chunk (state=3): >>><<< 8238 1726882384.94599: stdout chunk (state=3): >>><<< 8238 1726882384.94604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882384.94606: _low_level_execute_command(): starting 8238 1726882384.94610: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787 `" && echo ansible-tmp-1726882384.9453044-8875-51962240513787="` echo /root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787 `" ) && sleep 0' 8238 1726882384.95165: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882384.95181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882384.95205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882384.95308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882384.95341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882384.95358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882384.95472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882384.97464: stdout chunk (state=3): >>>ansible-tmp-1726882384.9453044-8875-51962240513787=/root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787 <<< 8238 1726882384.97667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882384.97670: stdout chunk (state=3): >>><<< 8238 1726882384.97673: stderr chunk (state=3): >>><<< 8238 1726882384.97688: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882384.9453044-8875-51962240513787=/root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882384.97743: variable 'ansible_module_compression' from 
source: unknown 8238 1726882384.97827: ANSIBALLZ: Using lock for service_facts 8238 1726882384.97830: ANSIBALLZ: Acquiring lock 8238 1726882384.97833: ANSIBALLZ: Lock acquired: 140036200463104 8238 1726882384.97836: ANSIBALLZ: Creating module 8238 1726882385.10975: ANSIBALLZ: Writing module into payload 8238 1726882385.11082: ANSIBALLZ: Writing module 8238 1726882385.11228: ANSIBALLZ: Renaming module 8238 1726882385.11231: ANSIBALLZ: Done creating module 8238 1726882385.11233: variable 'ansible_facts' from source: unknown 8238 1726882385.11236: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/AnsiballZ_service_facts.py 8238 1726882385.11457: Sending initial data 8238 1726882385.11481: Sent initial data (159 bytes) 8238 1726882385.12081: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882385.12105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882385.12229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882385.13924: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882385.14013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882385.14105: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpqo1tesuv /root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/AnsiballZ_service_facts.py <<< 8238 1726882385.14108: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/AnsiballZ_service_facts.py" <<< 8238 1726882385.14201: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpqo1tesuv" to remote "/root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/AnsiballZ_service_facts.py" <<< 8238 1726882385.15182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882385.15286: stderr chunk (state=3): >>><<< 8238 1726882385.15289: stdout chunk (state=3): >>><<< 8238 1726882385.15292: done transferring module to remote 8238 1726882385.15297: _low_level_execute_command(): starting 8238 1726882385.15306: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/ /root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/AnsiballZ_service_facts.py && sleep 0' 8238 1726882385.15929: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882385.15945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882385.15963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882385.16013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882385.16016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882385.16106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882385.18039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882385.18046: stdout chunk (state=3): >>><<< 8238 1726882385.18048: stderr chunk (state=3): >>><<< 8238 1726882385.18140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882385.18185: _low_level_execute_command(): starting 8238 1726882385.18188: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/AnsiballZ_service_facts.py && sleep 0' 8238 1726882385.18636: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882385.18640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882385.18642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882385.18644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882385.18646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882385.18700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882385.18703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882385.18795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882387.34329: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 8238 1726882387.34365: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": 
"modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "sour<<< 8238 1726882387.34389: stdout chunk (state=3): >>>ce": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.ser<<< 8238 1726882387.34406: stdout chunk (state=3): >>>vice": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-init<<< 8238 1726882387.34410: stdout chunk (state=3): >>>ramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": 
{"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inacti<<< 8238 1726882387.34414: stdout chunk (state=3): >>>ve", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 8238 1726882387.35955: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882387.36019: stderr chunk (state=3): >>><<< 8238 1726882387.36024: stdout chunk (state=3): >>><<< 8238 1726882387.36048: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": 
"plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
8238 1726882387.37404: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882387.37413: _low_level_execute_command(): starting 8238 1726882387.37418: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882384.9453044-8875-51962240513787/ > /dev/null 2>&1 && sleep 0' 8238 1726882387.37883: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882387.37888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882387.37915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882387.37918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882387.37920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882387.37925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.37986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882387.37990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882387.37992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882387.38082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882387.40005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882387.40060: stderr chunk (state=3): >>><<< 8238 1726882387.40063: stdout chunk (state=3): >>><<< 8238 1726882387.40076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882387.40083: handler run complete 8238 1726882387.40226: variable 'ansible_facts' from source: unknown 8238 1726882387.40346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882387.40677: variable 'ansible_facts' from source: unknown 8238 1726882387.40776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882387.40928: attempt loop complete, returning result 8238 1726882387.40933: _execute() done 8238 1726882387.40937: dumping result to json 8238 1726882387.40985: done dumping result, returning 8238 1726882387.40992: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-54bc-d334-00000000018d] 8238 1726882387.40997: sending task result for task 0affc7ec-ae25-54bc-d334-00000000018d ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882387.41654: no more pending results, returning what we have 8238 1726882387.41658: results queue empty 8238 1726882387.41658: checking for any_errors_fatal 8238 1726882387.41663: done checking for any_errors_fatal 8238 1726882387.41663: checking for max_fail_percentage 8238 1726882387.41665: done checking for max_fail_percentage 8238 1726882387.41666: checking to see if all hosts have failed and the running result is not ok 8238 1726882387.41667: done checking to see if all hosts have failed 8238 1726882387.41667: getting the remaining hosts for this loop 8238 1726882387.41669: done getting the remaining hosts for this loop 8238 1726882387.41672: getting the next task for host managed_node3 8238 1726882387.41678: done getting next task for host managed_node3 8238 1726882387.41681: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 8238 1726882387.41685: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882387.41693: getting variables 8238 1726882387.41695: in VariableManager get_vars() 8238 1726882387.41730: Calling all_inventory to load vars for managed_node3 8238 1726882387.41733: Calling groups_inventory to load vars for managed_node3 8238 1726882387.41736: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882387.41744: Calling all_plugins_play to load vars for managed_node3 8238 1726882387.41746: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882387.41751: Calling groups_plugins_play to load vars for managed_node3 8238 1726882387.42112: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000018d 8238 1726882387.42117: WORKER PROCESS EXITING 8238 1726882387.42130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882387.42468: done with get_vars() 8238 1726882387.42479: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:07 -0400 (0:00:02.521) 0:00:17.580 ****** 8238 1726882387.42564: entering _queue_task() for managed_node3/package_facts 8238 1726882387.42565: Creating lock for package_facts 8238 1726882387.42828: worker is 1 (out of 1 available) 8238 1726882387.42844: exiting _queue_task() for managed_node3/package_facts 8238 1726882387.42858: done queuing things up, now waiting for results queue to drain 8238 1726882387.42861: waiting for pending results... 8238 1726882387.43028: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 8238 1726882387.43132: in run() - task 0affc7ec-ae25-54bc-d334-00000000018e 8238 1726882387.43145: variable 'ansible_search_path' from source: unknown 8238 1726882387.43149: variable 'ansible_search_path' from source: unknown 8238 1726882387.43181: calling self._execute() 8238 1726882387.43252: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882387.43256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882387.43264: variable 'omit' from source: magic vars 8238 1726882387.43556: variable 'ansible_distribution_major_version' from source: facts 8238 1726882387.43565: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882387.43570: variable 'omit' from source: magic vars 8238 1726882387.43624: variable 'omit' from source: magic vars 8238 1726882387.43656: variable 'omit' from source: magic vars 8238 1726882387.43690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882387.43721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882387.43739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882387.43829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882387.43832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882387.43835: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882387.43839: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882387.43843: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882387.43875: Set connection var ansible_connection to ssh 8238 1726882387.43878: Set connection var ansible_shell_type to sh 8238 1726882387.43881: Set connection var ansible_pipelining to False 8238 1726882387.43888: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882387.43894: Set connection var ansible_timeout to 10 8238 1726882387.43901: Set connection var ansible_shell_executable to /bin/sh 8238 1726882387.43919: variable 'ansible_shell_executable' from source: unknown 8238 1726882387.43924: variable 'ansible_connection' from source: unknown 8238 1726882387.43927: variable 'ansible_module_compression' from source: unknown 8238 1726882387.43930: variable 'ansible_shell_type' from source: unknown 8238 1726882387.43932: variable 'ansible_shell_executable' from source: unknown 8238 1726882387.43935: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882387.43937: variable 'ansible_pipelining' from source: unknown 8238 1726882387.43940: variable 'ansible_timeout' from source: unknown 8238 1726882387.43945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882387.44111: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882387.44119: variable 'omit' from source: magic vars 8238 1726882387.44125: starting attempt loop 8238 1726882387.44128: running the handler 8238 1726882387.44141: _low_level_execute_command(): starting 8238 1726882387.44148: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882387.44703: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882387.44707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.44711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882387.44713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.44775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882387.44778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882387.44782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882387.44867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882387.46618: stdout chunk (state=3): >>>/root <<< 8238 1726882387.46726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 
1726882387.46781: stderr chunk (state=3): >>><<< 8238 1726882387.46785: stdout chunk (state=3): >>><<< 8238 1726882387.46808: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882387.46825: _low_level_execute_command(): starting 8238 1726882387.46831: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182 `" && echo ansible-tmp-1726882387.4680665-8942-50936256699182="` echo /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182 `" ) && sleep 0' 8238 1726882387.47319: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882387.47326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882387.47329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.47339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882387.47341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.47388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882387.47392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882387.47482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882387.49465: stdout chunk (state=3): >>>ansible-tmp-1726882387.4680665-8942-50936256699182=/root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182 <<< 8238 1726882387.49573: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882387.49637: stderr chunk (state=3): >>><<< 8238 1726882387.49643: stdout chunk (state=3): >>><<< 8238 1726882387.49659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882387.4680665-8942-50936256699182=/root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882387.49703: variable 'ansible_module_compression' from source: unknown 8238 1726882387.49753: ANSIBALLZ: Using lock for package_facts 8238 1726882387.49756: ANSIBALLZ: Acquiring lock 8238 1726882387.49759: ANSIBALLZ: Lock acquired: 140036200589040 8238 1726882387.49762: ANSIBALLZ: Creating module 8238 1726882387.72118: ANSIBALLZ: Writing module into payload 8238 1726882387.72228: ANSIBALLZ: Writing module 8238 1726882387.72255: ANSIBALLZ: Renaming module 8238 1726882387.72259: ANSIBALLZ: Done creating module 8238 1726882387.72280: variable 'ansible_facts' from source: unknown 8238 1726882387.72404: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/AnsiballZ_package_facts.py 8238 1726882387.72520: Sending initial data 8238 1726882387.72527: Sent initial data (159 bytes) 8238 1726882387.73052: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882387.73056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.73058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882387.73061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882387.73063: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.73115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882387.73118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882387.73121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882387.73216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882387.74938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8238 1726882387.74945: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882387.75023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882387.75104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp34rtjdkp /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/AnsiballZ_package_facts.py <<< 8238 1726882387.75112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/AnsiballZ_package_facts.py" <<< 8238 1726882387.75186: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp34rtjdkp" to remote "/root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/AnsiballZ_package_facts.py" <<< 8238 1726882387.75193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/AnsiballZ_package_facts.py" <<< 8238 1726882387.76472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882387.76544: stderr chunk (state=3): >>><<< 8238 1726882387.76547: stdout chunk (state=3): >>><<< 8238 1726882387.76569: done transferring module to remote 8238 1726882387.76579: _low_level_execute_command(): starting 8238 1726882387.76584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/ /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/AnsiballZ_package_facts.py && sleep 0' 8238 1726882387.77055: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882387.77058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882387.77065: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.77070: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882387.77088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.77128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882387.77132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882387.77145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882387.77226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882387.79130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882387.79136: stderr chunk (state=3): >>><<< 8238 1726882387.79139: stdout chunk (state=3): >>><<< 8238 1726882387.79152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882387.79158: _low_level_execute_command(): starting 8238 1726882387.79164: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/AnsiballZ_package_facts.py && sleep 0' 8238 1726882387.79629: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882387.79633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882387.79636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882387.79638: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 8238 1726882387.79640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882387.79692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882387.79699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882387.79788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882388.42700: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 8238 1726882388.42729: stdout chunk (state=3): >>>me": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 8238 1726882388.42734: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": 
"0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 8238 1726882388.42780: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", 
"release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "sourc<<< 8238 1726882388.42795: stdout chunk (state=3): >>>e": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": 
"os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 8238 1726882388.42807: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1",<<< 8238 1726882388.42820: stdout chunk (state=3): >>> "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": 
"x86_6<<< 8238 1726882388.42846: stdout chunk (state=3): >>>4", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": 
"python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rp<<< 8238 1726882388.42881: stdout chunk (state=3): >>>m"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", 
"version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": 
"perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], <<< 8238 1726882388.42890: stdout chunk (state=3): >>>"perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "<<< 8238 1726882388.42898: stdout chunk (state=3): >>>arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": 
"9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch":<<< 8238 1726882388.42911: stdout chunk (state=3): >>> null, 
"arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": 
"python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": 
[{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 8238 1726882388.44828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882388.44890: stderr chunk (state=3): >>><<< 8238 1726882388.44894: stdout chunk (state=3): >>><<< 8238 1726882388.44939: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": 
"3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": 
"libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", 
"version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": 
"xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": 
[{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": 
"3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882388.46798: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882388.46818: _low_level_execute_command(): starting 8238 1726882388.46824: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882387.4680665-8942-50936256699182/ > /dev/null 2>&1 && sleep 0' 8238 1726882388.47333: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882388.47337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882388.47339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882388.47342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882388.47402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882388.47411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882388.47414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882388.47506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882388.49477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882388.49539: stderr chunk (state=3): >>><<< 8238 1726882388.49543: stdout chunk (state=3): >>><<< 8238 1726882388.49559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882388.49567: handler run complete 8238 1726882388.50190: variable 'ansible_facts' from source: unknown 8238 1726882388.50524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882388.52053: variable 'ansible_facts' from source: unknown 8238 1726882388.52371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882388.52927: attempt loop complete, returning result 8238 1726882388.52940: _execute() done 8238 1726882388.52944: dumping result to json 8238 1726882388.53093: done dumping result, returning 8238 1726882388.53102: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-54bc-d334-00000000018e] 8238 1726882388.53107: sending task result for task 0affc7ec-ae25-54bc-d334-00000000018e ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882388.54930: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000018e 8238 1726882388.54933: WORKER PROCESS EXITING 8238 1726882388.54942: no more pending results, returning what we have 8238 1726882388.54944: results queue empty 8238 1726882388.54945: checking for any_errors_fatal 8238 1726882388.54948: done checking for any_errors_fatal 8238 1726882388.54948: checking for max_fail_percentage 8238 1726882388.54951: done checking for max_fail_percentage 8238 1726882388.54952: checking to see if all hosts have failed and the running result is not ok 8238 1726882388.54953: done checking to see if all hosts have failed 8238 1726882388.54953: getting the remaining hosts for this loop 8238 1726882388.54954: done getting the remaining hosts for this loop 8238 1726882388.54957: getting the next task for host managed_node3 8238 1726882388.54963: done getting next task for host managed_node3 8238 1726882388.54965: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 8238 1726882388.54968: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882388.54975: getting variables 8238 1726882388.54978: in VariableManager get_vars() 8238 1726882388.55005: Calling all_inventory to load vars for managed_node3 8238 1726882388.55007: Calling groups_inventory to load vars for managed_node3 8238 1726882388.55008: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882388.55015: Calling all_plugins_play to load vars for managed_node3 8238 1726882388.55017: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882388.55019: Calling groups_plugins_play to load vars for managed_node3 8238 1726882388.56176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882388.58392: done with get_vars() 8238 1726882388.58428: done getting variables 8238 1726882388.58503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:08 -0400 (0:00:01.159) 0:00:18.740 ****** 8238 1726882388.58553: entering _queue_task() for managed_node3/debug 8238 1726882388.59202: worker is 1 (out of 1 available) 8238 1726882388.59219: exiting _queue_task() for managed_node3/debug 8238 1726882388.59331: done queuing things up, now waiting for results queue to drain 8238 1726882388.59333: waiting for pending results... 
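
[Editor's note: the task traced above, "Check which packages are installed", ran the package_facts module with module_args {"manager": ["auto"], "strategy": "first"} and its result was reported as censored because no_log was in effect. As a rough, hedged sketch only — the exact wording in the role's tasks file may differ, and the manager/strategy values may simply be the module defaults rather than explicit arguments — such a task could be written as:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # mirrors the module_args shown in the log; "auto" is also the default
        strategy: first    # likewise taken from the logged module_args
      no_log: true         # why the result above shows only the "censored" placeholder

The next task being queued, "Print network provider" (roles/network/tasks/main.yml:7), is illustrated after its execution trace below.]
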
8238 1726882388.60193: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 8238 1726882388.60524: in run() - task 0affc7ec-ae25-54bc-d334-000000000027 8238 1726882388.60730: variable 'ansible_search_path' from source: unknown 8238 1726882388.60734: variable 'ansible_search_path' from source: unknown 8238 1726882388.60737: calling self._execute() 8238 1726882388.60739: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882388.60742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882388.60745: variable 'omit' from source: magic vars 8238 1726882388.61740: variable 'ansible_distribution_major_version' from source: facts 8238 1726882388.61768: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882388.61784: variable 'omit' from source: magic vars 8238 1726882388.61860: variable 'omit' from source: magic vars 8238 1726882388.61982: variable 'network_provider' from source: set_fact 8238 1726882388.62011: variable 'omit' from source: magic vars 8238 1726882388.62071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882388.62118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882388.62149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882388.62180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882388.62199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882388.62243: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882388.62257: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882388.62270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882388.62394: Set connection var ansible_connection to ssh 8238 1726882388.62405: Set connection var ansible_shell_type to sh 8238 1726882388.62418: Set connection var ansible_pipelining to False 8238 1726882388.62435: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882388.62448: Set connection var ansible_timeout to 10 8238 1726882388.62466: Set connection var ansible_shell_executable to /bin/sh 8238 1726882388.62498: variable 'ansible_shell_executable' from source: unknown 8238 1726882388.62509: variable 'ansible_connection' from source: unknown 8238 1726882388.62628: variable 'ansible_module_compression' from source: unknown 8238 1726882388.62632: variable 'ansible_shell_type' from source: unknown 8238 1726882388.62636: variable 'ansible_shell_executable' from source: unknown 8238 1726882388.62638: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882388.62640: variable 'ansible_pipelining' from source: unknown 8238 1726882388.62643: variable 'ansible_timeout' from source: unknown 8238 1726882388.62645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882388.62731: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 
1726882388.62753: variable 'omit' from source: magic vars 8238 1726882388.62764: starting attempt loop 8238 1726882388.62770: running the handler 8238 1726882388.62823: handler run complete 8238 1726882388.62844: attempt loop complete, returning result 8238 1726882388.62854: _execute() done 8238 1726882388.62863: dumping result to json 8238 1726882388.62870: done dumping result, returning 8238 1726882388.62882: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-54bc-d334-000000000027] 8238 1726882388.62891: sending task result for task 0affc7ec-ae25-54bc-d334-000000000027 ok: [managed_node3] => {} MSG: Using network provider: nm 8238 1726882388.63070: no more pending results, returning what we have 8238 1726882388.63073: results queue empty 8238 1726882388.63074: checking for any_errors_fatal 8238 1726882388.63084: done checking for any_errors_fatal 8238 1726882388.63084: checking for max_fail_percentage 8238 1726882388.63086: done checking for max_fail_percentage 8238 1726882388.63087: checking to see if all hosts have failed and the running result is not ok 8238 1726882388.63087: done checking to see if all hosts have failed 8238 1726882388.63088: getting the remaining hosts for this loop 8238 1726882388.63089: done getting the remaining hosts for this loop 8238 1726882388.63094: getting the next task for host managed_node3 8238 1726882388.63101: done getting next task for host managed_node3 8238 1726882388.63104: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8238 1726882388.63108: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882388.63118: getting variables 8238 1726882388.63120: in VariableManager get_vars() 8238 1726882388.63166: Calling all_inventory to load vars for managed_node3 8238 1726882388.63170: Calling groups_inventory to load vars for managed_node3 8238 1726882388.63172: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882388.63182: Calling all_plugins_play to load vars for managed_node3 8238 1726882388.63185: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882388.63187: Calling groups_plugins_play to load vars for managed_node3 8238 1726882388.63749: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000027 8238 1726882388.63755: WORKER PROCESS EXITING 8238 1726882388.65628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882388.69516: done with get_vars() 8238 1726882388.69558: done getting variables 8238 1726882388.69865: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:08 -0400 (0:00:00.113) 0:00:18.854 ****** 8238 1726882388.69900: entering _queue_task() for managed_node3/fail 8238 1726882388.69902: Creating lock for fail 8238 1726882388.70583: worker is 1 (out of 1 available) 8238 1726882388.70597: exiting _queue_task() for managed_node3/fail 8238 1726882388.70612: done queuing things up, now waiting for results queue to drain 8238 1726882388.70614: waiting for pending results... 
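The "Print network provider" entries above resolve network_provider from an earlier set_fact and report "Using network provider: nm" through the debug action plugin. The role's real task in roles/network/tasks/main.yml is not reproduced in this log; the following is only a hypothetical sketch of a task that would produce that output, assuming nothing beyond the debug module and the network_provider variable visible above.

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # renders the MSG line logged above when network_provider is 'nm'

The source shown for network_provider is set_fact, so the value printed here was established by an earlier step of the role rather than by inventory or role defaults.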
8238 1726882388.71029: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8238 1726882388.71337: in run() - task 0affc7ec-ae25-54bc-d334-000000000028 8238 1726882388.71365: variable 'ansible_search_path' from source: unknown 8238 1726882388.71374: variable 'ansible_search_path' from source: unknown 8238 1726882388.71417: calling self._execute() 8238 1726882388.71613: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882388.71629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882388.71646: variable 'omit' from source: magic vars 8238 1726882388.72076: variable 'ansible_distribution_major_version' from source: facts 8238 1726882388.72097: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882388.72241: variable 'network_state' from source: role '' defaults 8238 1726882388.72259: Evaluated conditional (network_state != {}): False 8238 1726882388.72267: when evaluation is False, skipping this task 8238 1726882388.72275: _execute() done 8238 1726882388.72283: dumping result to json 8238 1726882388.72292: done dumping result, returning 8238 1726882388.72310: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-54bc-d334-000000000028] 8238 1726882388.72325: sending task result for task 0affc7ec-ae25-54bc-d334-000000000028 8238 1726882388.72564: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000028 8238 1726882388.72567: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882388.72679: no more pending results, returning what we have 8238 1726882388.72683: results queue empty 8238 1726882388.72684: checking for any_errors_fatal 8238 1726882388.72690: done checking for any_errors_fatal 8238 1726882388.72690: checking for max_fail_percentage 8238 1726882388.72692: done checking for max_fail_percentage 8238 1726882388.72693: checking to see if all hosts have failed and the running result is not ok 8238 1726882388.72693: done checking to see if all hosts have failed 8238 1726882388.72694: getting the remaining hosts for this loop 8238 1726882388.72695: done getting the remaining hosts for this loop 8238 1726882388.72699: getting the next task for host managed_node3 8238 1726882388.72705: done getting next task for host managed_node3 8238 1726882388.72708: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8238 1726882388.72712: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882388.72729: getting variables 8238 1726882388.72730: in VariableManager get_vars() 8238 1726882388.72768: Calling all_inventory to load vars for managed_node3 8238 1726882388.72771: Calling groups_inventory to load vars for managed_node3 8238 1726882388.72773: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882388.72782: Calling all_plugins_play to load vars for managed_node3 8238 1726882388.72785: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882388.72788: Calling groups_plugins_play to load vars for managed_node3 8238 1726882388.76380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882388.78706: done with get_vars() 8238 1726882388.78736: done getting variables 8238 1726882388.78788: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:08 -0400 (0:00:00.089) 0:00:18.943 ****** 8238 1726882388.78815: entering _queue_task() for managed_node3/fail 8238 1726882388.79083: worker is 1 (out of 1 available) 8238 1726882388.79096: exiting _queue_task() for managed_node3/fail 8238 1726882388.79110: done queuing things up, now waiting for results queue to drain 8238 1726882388.79112: waiting for pending results... 
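The skip above for the initscripts abort task reports false_condition "network_state != {}"; network_state comes from the role defaults and is still an empty dict, so the fail action never runs. Below is a hypothetical sketch of a guard of this shape, assuming only the fail action and the condition shown in the skip result; the message and the provider check are assumptions suggested by the task name, and the real task at main.yml:11 may carry further conditions the log never reaches.

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider   # hypothetical message
  when:
    - network_state != {}                   # copied from the false_condition reported above
    - network_provider == "initscripts"     # assumed extra guard, implied only by the task name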
8238 1726882388.79300: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8238 1726882388.79393: in run() - task 0affc7ec-ae25-54bc-d334-000000000029 8238 1726882388.79405: variable 'ansible_search_path' from source: unknown 8238 1726882388.79409: variable 'ansible_search_path' from source: unknown 8238 1726882388.79443: calling self._execute() 8238 1726882388.79539: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882388.79543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882388.79552: variable 'omit' from source: magic vars 8238 1726882388.80028: variable 'ansible_distribution_major_version' from source: facts 8238 1726882388.80032: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882388.80134: variable 'network_state' from source: role '' defaults 8238 1726882388.80151: Evaluated conditional (network_state != {}): False 8238 1726882388.80158: when evaluation is False, skipping this task 8238 1726882388.80166: _execute() done 8238 1726882388.80182: dumping result to json 8238 1726882388.80190: done dumping result, returning 8238 1726882388.80202: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-54bc-d334-000000000029] 8238 1726882388.80285: sending task result for task 0affc7ec-ae25-54bc-d334-000000000029 8238 1726882388.80373: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000029 8238 1726882388.80377: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882388.80433: no more pending results, returning what we have 8238 1726882388.80438: results queue empty 8238 1726882388.80439: checking for any_errors_fatal 8238 1726882388.80450: done checking for any_errors_fatal 8238 1726882388.80450: checking for max_fail_percentage 8238 1726882388.80452: done checking for max_fail_percentage 8238 1726882388.80453: checking to see if all hosts have failed and the running result is not ok 8238 1726882388.80454: done checking to see if all hosts have failed 8238 1726882388.80455: getting the remaining hosts for this loop 8238 1726882388.80456: done getting the remaining hosts for this loop 8238 1726882388.80461: getting the next task for host managed_node3 8238 1726882388.80469: done getting next task for host managed_node3 8238 1726882388.80473: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8238 1726882388.80478: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882388.80496: getting variables 8238 1726882388.80498: in VariableManager get_vars() 8238 1726882388.80544: Calling all_inventory to load vars for managed_node3 8238 1726882388.80548: Calling groups_inventory to load vars for managed_node3 8238 1726882388.80550: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882388.80564: Calling all_plugins_play to load vars for managed_node3 8238 1726882388.80567: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882388.80571: Calling groups_plugins_play to load vars for managed_node3 8238 1726882388.82389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882388.83784: done with get_vars() 8238 1726882388.83816: done getting variables 8238 1726882388.83886: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:08 -0400 (0:00:00.051) 0:00:18.994 ****** 8238 1726882388.83924: entering _queue_task() for managed_node3/fail 8238 1726882388.84515: worker is 1 (out of 1 available) 8238 1726882388.84632: exiting _queue_task() for managed_node3/fail 8238 1726882388.84643: done queuing things up, now waiting for results queue to drain 8238 1726882388.84645: waiting for pending results... 
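The "below 8" abort above is skipped on the same false_condition, network_state != {}, before any version comparison is attempted. As a hedged sketch of how such a task could combine that guard with the version check implied by its name (the conditions at main.yml:18 are not shown past the first False item):

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host of version 8 or later   # hypothetical message
  when:
    - network_state != {}                              # reported as the false_condition above
    - ansible_distribution_major_version | int < 8     # assumed second guard, inferred from the task name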
8238 1726882388.85176: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8238 1726882388.85373: in run() - task 0affc7ec-ae25-54bc-d334-00000000002a 8238 1726882388.85378: variable 'ansible_search_path' from source: unknown 8238 1726882388.85381: variable 'ansible_search_path' from source: unknown 8238 1726882388.85481: calling self._execute() 8238 1726882388.85506: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882388.85519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882388.85541: variable 'omit' from source: magic vars 8238 1726882388.85992: variable 'ansible_distribution_major_version' from source: facts 8238 1726882388.86061: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882388.86219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882388.89197: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882388.89201: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882388.89246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882388.89297: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882388.89337: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882388.89529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882388.89534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882388.89536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882388.89567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882388.89587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882388.89690: variable 'ansible_distribution_major_version' from source: facts 8238 1726882388.89703: Evaluated conditional (ansible_distribution_major_version | int > 9): True 8238 1726882388.89836: variable 'ansible_distribution' from source: facts 8238 1726882388.89840: variable '__network_rh_distros' from source: role '' defaults 8238 1726882388.89854: Evaluated conditional (ansible_distribution in __network_rh_distros): False 8238 1726882388.89858: when evaluation is False, skipping this task 8238 1726882388.89862: _execute() done 8238 1726882388.89865: dumping result to json 8238 1726882388.89870: done dumping result, returning 8238 1726882388.89887: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-54bc-d334-00000000002a] 8238 1726882388.89891: sending task result for task 0affc7ec-ae25-54bc-d334-00000000002a skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 8238 1726882388.90047: no more pending results, returning what we have 8238 1726882388.90054: results queue empty 8238 1726882388.90056: checking for any_errors_fatal 8238 1726882388.90061: done checking for any_errors_fatal 8238 1726882388.90062: checking for max_fail_percentage 8238 1726882388.90063: done checking for max_fail_percentage 8238 1726882388.90064: checking to see if all hosts have failed and the running result is not ok 8238 1726882388.90065: done checking to see if all hosts have failed 8238 1726882388.90066: getting the remaining hosts for this loop 8238 1726882388.90067: done getting the remaining hosts for this loop 8238 1726882388.90072: getting the next task for host managed_node3 8238 1726882388.90079: done getting next task for host managed_node3 8238 1726882388.90083: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8238 1726882388.90087: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882388.90102: getting variables 8238 1726882388.90104: in VariableManager get_vars() 8238 1726882388.90148: Calling all_inventory to load vars for managed_node3 8238 1726882388.90154: Calling groups_inventory to load vars for managed_node3 8238 1726882388.90156: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882388.90166: Calling all_plugins_play to load vars for managed_node3 8238 1726882388.90169: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882388.90171: Calling groups_plugins_play to load vars for managed_node3 8238 1726882388.90741: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000002a 8238 1726882388.90745: WORKER PROCESS EXITING 8238 1726882388.92262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882388.93429: done with get_vars() 8238 1726882388.93453: done getting variables 8238 1726882388.93561: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:08 -0400 (0:00:00.096) 0:00:19.091 ****** 8238 1726882388.93594: entering _queue_task() for managed_node3/dnf 8238 1726882388.94065: worker is 1 (out of 1 available) 8238 1726882388.94078: exiting _queue_task() for managed_node3/dnf 8238 1726882388.94090: done queuing things up, now waiting for results queue to drain 8238 1726882388.94092: waiting for pending results... 
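For the EL10 teaming abort above, the log shows two conditions evaluated in order: ansible_distribution_major_version | int > 9 is True, then ansible_distribution in __network_rh_distros is False, and the task is skipped naming only the latter. A sketch of that condition list, assuming only what the log evaluates (the fail message and any further team-specific guards are hypothetical):

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later   # hypothetical wording
  when:
    - ansible_distribution_major_version | int > 9     # evaluated True in the log above
    - ansible_distribution in __network_rh_distros     # evaluated False, so the task is skipped

List-form when: items are ANDed and checked in order, which is why the skip result reports only the first item that came back False.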
8238 1726882388.94341: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8238 1726882388.94506: in run() - task 0affc7ec-ae25-54bc-d334-00000000002b 8238 1726882388.94532: variable 'ansible_search_path' from source: unknown 8238 1726882388.94542: variable 'ansible_search_path' from source: unknown 8238 1726882388.94590: calling self._execute() 8238 1726882388.94685: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882388.94690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882388.94699: variable 'omit' from source: magic vars 8238 1726882388.95007: variable 'ansible_distribution_major_version' from source: facts 8238 1726882388.95016: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882388.95170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882388.97048: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882388.97105: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882388.97137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882388.97165: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882388.97185: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882388.97257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882388.97277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882388.97296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882388.97326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882388.97341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882388.97428: variable 'ansible_distribution' from source: facts 8238 1726882388.97431: variable 'ansible_distribution_major_version' from source: facts 8238 1726882388.97444: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 8238 1726882388.97529: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882388.97626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882388.97645: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882388.97669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882388.97695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882388.97706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882388.97739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882388.97757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882388.97927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882388.97930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882388.97933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882388.97935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882388.97937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882388.97940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882388.97942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882388.97945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882388.98028: variable 'network_connections' from source: task vars 8238 1726882388.98039: variable 'controller_profile' from source: play vars 8238 1726882388.98090: variable 'controller_profile' from source: play vars 8238 1726882388.98101: variable 'controller_device' from source: play vars 8238 1726882388.98146: variable 'controller_device' from source: play vars 8238 1726882388.98157: variable 'port1_profile' from source: play vars 8238 
1726882388.98204: variable 'port1_profile' from source: play vars 8238 1726882388.98207: variable 'dhcp_interface1' from source: play vars 8238 1726882388.98259: variable 'dhcp_interface1' from source: play vars 8238 1726882388.98263: variable 'controller_profile' from source: play vars 8238 1726882388.98307: variable 'controller_profile' from source: play vars 8238 1726882388.98314: variable 'port2_profile' from source: play vars 8238 1726882388.98364: variable 'port2_profile' from source: play vars 8238 1726882388.98370: variable 'dhcp_interface2' from source: play vars 8238 1726882388.98415: variable 'dhcp_interface2' from source: play vars 8238 1726882388.98419: variable 'controller_profile' from source: play vars 8238 1726882388.98471: variable 'controller_profile' from source: play vars 8238 1726882388.98526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882388.98654: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882388.98697: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882388.98723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882388.98746: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882388.98785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882388.98801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882388.98819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882388.98841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882388.98895: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882388.99071: variable 'network_connections' from source: task vars 8238 1726882388.99075: variable 'controller_profile' from source: play vars 8238 1726882388.99125: variable 'controller_profile' from source: play vars 8238 1726882388.99131: variable 'controller_device' from source: play vars 8238 1726882388.99178: variable 'controller_device' from source: play vars 8238 1726882388.99187: variable 'port1_profile' from source: play vars 8238 1726882388.99235: variable 'port1_profile' from source: play vars 8238 1726882388.99241: variable 'dhcp_interface1' from source: play vars 8238 1726882388.99287: variable 'dhcp_interface1' from source: play vars 8238 1726882388.99293: variable 'controller_profile' from source: play vars 8238 1726882388.99341: variable 'controller_profile' from source: play vars 8238 1726882388.99347: variable 'port2_profile' from source: play vars 8238 1726882388.99393: variable 'port2_profile' from source: play vars 8238 1726882388.99399: variable 'dhcp_interface2' from source: play vars 8238 1726882388.99448: variable 'dhcp_interface2' from source: play vars 8238 1726882388.99456: variable 'controller_profile' from 
source: play vars 8238 1726882388.99499: variable 'controller_profile' from source: play vars 8238 1726882388.99528: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8238 1726882388.99531: when evaluation is False, skipping this task 8238 1726882388.99534: _execute() done 8238 1726882388.99541: dumping result to json 8238 1726882388.99544: done dumping result, returning 8238 1726882388.99550: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-00000000002b] 8238 1726882388.99558: sending task result for task 0affc7ec-ae25-54bc-d334-00000000002b 8238 1726882388.99659: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000002b 8238 1726882388.99662: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8238 1726882388.99714: no more pending results, returning what we have 8238 1726882388.99717: results queue empty 8238 1726882388.99719: checking for any_errors_fatal 8238 1726882388.99727: done checking for any_errors_fatal 8238 1726882388.99728: checking for max_fail_percentage 8238 1726882388.99730: done checking for max_fail_percentage 8238 1726882388.99731: checking to see if all hosts have failed and the running result is not ok 8238 1726882388.99731: done checking to see if all hosts have failed 8238 1726882388.99732: getting the remaining hosts for this loop 8238 1726882388.99735: done getting the remaining hosts for this loop 8238 1726882388.99739: getting the next task for host managed_node3 8238 1726882388.99746: done getting next task for host managed_node3 8238 1726882388.99750: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8238 1726882388.99753: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882388.99768: getting variables 8238 1726882388.99770: in VariableManager get_vars() 8238 1726882388.99813: Calling all_inventory to load vars for managed_node3 8238 1726882388.99816: Calling groups_inventory to load vars for managed_node3 8238 1726882388.99818: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882388.99834: Calling all_plugins_play to load vars for managed_node3 8238 1726882388.99837: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882388.99841: Calling groups_plugins_play to load vars for managed_node3 8238 1726882389.00912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882389.02072: done with get_vars() 8238 1726882389.02092: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8238 1726882389.02162: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:09 -0400 (0:00:00.085) 0:00:19.176 ****** 8238 1726882389.02187: entering _queue_task() for managed_node3/yum 8238 1726882389.02189: Creating lock for yum 8238 1726882389.02468: worker is 1 (out of 1 available) 8238 1726882389.02485: exiting _queue_task() for managed_node3/yum 8238 1726882389.02499: done queuing things up, now waiting for results queue to drain 8238 1726882389.02501: waiting for pending results... 
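The DNF update check at main.yml:36 above walks the network_connections play vars, finds neither wireless nor team entries, and is skipped on __network_wireless_connections_defined or __network_team_connections_defined; the log then shows the companion YUM task being redirected to the dnf action plugin. A hypothetical sketch of a check of this shape, assuming the condition from the skip result; the package list and the check-only arguments are assumptions, not the role's real arguments:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"     # assumed target list
    state: latest
  check_mode: true                     # assumed: probe for available updates without applying them
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined

The YUM variant at main.yml:48 is additionally gated on ansible_distribution_major_version | int < 8, which is the false_condition reported when it is skipped in the entries that follow.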
8238 1726882389.02677: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8238 1726882389.02774: in run() - task 0affc7ec-ae25-54bc-d334-00000000002c 8238 1726882389.02785: variable 'ansible_search_path' from source: unknown 8238 1726882389.02789: variable 'ansible_search_path' from source: unknown 8238 1726882389.02824: calling self._execute() 8238 1726882389.02896: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.02900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.02910: variable 'omit' from source: magic vars 8238 1726882389.03201: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.03212: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882389.03348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882389.04997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882389.05054: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882389.05081: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882389.05107: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882389.05131: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882389.05197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.05218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.05240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.05273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.05284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.05367: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.05380: Evaluated conditional (ansible_distribution_major_version | int < 8): False 8238 1726882389.05383: when evaluation is False, skipping this task 8238 1726882389.05386: _execute() done 8238 1726882389.05390: dumping result to json 8238 1726882389.05395: done dumping result, returning 8238 1726882389.05403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-00000000002c] 8238 1726882389.05408: sending task result for 
task 0affc7ec-ae25-54bc-d334-00000000002c 8238 1726882389.05510: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000002c 8238 1726882389.05512: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 8238 1726882389.05568: no more pending results, returning what we have 8238 1726882389.05572: results queue empty 8238 1726882389.05573: checking for any_errors_fatal 8238 1726882389.05579: done checking for any_errors_fatal 8238 1726882389.05580: checking for max_fail_percentage 8238 1726882389.05581: done checking for max_fail_percentage 8238 1726882389.05582: checking to see if all hosts have failed and the running result is not ok 8238 1726882389.05583: done checking to see if all hosts have failed 8238 1726882389.05583: getting the remaining hosts for this loop 8238 1726882389.05585: done getting the remaining hosts for this loop 8238 1726882389.05589: getting the next task for host managed_node3 8238 1726882389.05596: done getting next task for host managed_node3 8238 1726882389.05600: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8238 1726882389.05605: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882389.05621: getting variables 8238 1726882389.05624: in VariableManager get_vars() 8238 1726882389.05667: Calling all_inventory to load vars for managed_node3 8238 1726882389.05670: Calling groups_inventory to load vars for managed_node3 8238 1726882389.05672: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882389.05681: Calling all_plugins_play to load vars for managed_node3 8238 1726882389.05683: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882389.05686: Calling groups_plugins_play to load vars for managed_node3 8238 1726882389.06771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882389.07945: done with get_vars() 8238 1726882389.07967: done getting variables 8238 1726882389.08019: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:09 -0400 (0:00:00.058) 0:00:19.235 ****** 8238 1726882389.08047: entering _queue_task() for managed_node3/fail 8238 1726882389.08312: worker is 1 (out of 1 available) 8238 1726882389.08329: exiting _queue_task() for managed_node3/fail 8238 1726882389.08342: done queuing things up, now waiting for results queue to drain 8238 1726882389.08344: waiting for pending results... 8238 1726882389.08543: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8238 1726882389.08641: in run() - task 0affc7ec-ae25-54bc-d334-00000000002d 8238 1726882389.08645: variable 'ansible_search_path' from source: unknown 8238 1726882389.08647: variable 'ansible_search_path' from source: unknown 8238 1726882389.08683: calling self._execute() 8238 1726882389.08760: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.08765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.08775: variable 'omit' from source: magic vars 8238 1726882389.09075: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.09085: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882389.09178: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882389.09327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882389.11493: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882389.11543: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882389.11575: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882389.11601: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882389.11625: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882389.11696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.11717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.11737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.11771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.11784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.11823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.11840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.11862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.11895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.11907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.11939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.11960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.11981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.12010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.12020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.12149: variable 'network_connections' from source: task vars 8238 1726882389.12163: variable 
'controller_profile' from source: play vars 8238 1726882389.12219: variable 'controller_profile' from source: play vars 8238 1726882389.12228: variable 'controller_device' from source: play vars 8238 1726882389.12276: variable 'controller_device' from source: play vars 8238 1726882389.12285: variable 'port1_profile' from source: play vars 8238 1726882389.12333: variable 'port1_profile' from source: play vars 8238 1726882389.12339: variable 'dhcp_interface1' from source: play vars 8238 1726882389.12386: variable 'dhcp_interface1' from source: play vars 8238 1726882389.12392: variable 'controller_profile' from source: play vars 8238 1726882389.12441: variable 'controller_profile' from source: play vars 8238 1726882389.12447: variable 'port2_profile' from source: play vars 8238 1726882389.12493: variable 'port2_profile' from source: play vars 8238 1726882389.12499: variable 'dhcp_interface2' from source: play vars 8238 1726882389.12549: variable 'dhcp_interface2' from source: play vars 8238 1726882389.12557: variable 'controller_profile' from source: play vars 8238 1726882389.12600: variable 'controller_profile' from source: play vars 8238 1726882389.12661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882389.12800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882389.12832: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882389.12861: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882389.12884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882389.12918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882389.12937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882389.12960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.12982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882389.13127: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882389.13316: variable 'network_connections' from source: task vars 8238 1726882389.13330: variable 'controller_profile' from source: play vars 8238 1726882389.13396: variable 'controller_profile' from source: play vars 8238 1726882389.13406: variable 'controller_device' from source: play vars 8238 1726882389.13475: variable 'controller_device' from source: play vars 8238 1726882389.13488: variable 'port1_profile' from source: play vars 8238 1726882389.13558: variable 'port1_profile' from source: play vars 8238 1726882389.13570: variable 'dhcp_interface1' from source: play vars 8238 1726882389.13638: variable 'dhcp_interface1' from source: play vars 8238 1726882389.13653: variable 'controller_profile' from source: play vars 8238 1726882389.13724: variable 'controller_profile' from source: play vars 
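While evaluating the consent prompt's conditional, the executor repeatedly resolves controller_profile, controller_device, port1_profile/port2_profile and dhcp_interface1/dhcp_interface2 from play vars, which suggests a bond-style network_connections definition built from those variables. Purely as a hypothetical illustration of that shape (the play's actual vars are not reproduced in this log, and the type and controller keys below are assumptions):

network_connections:
  - name: "{{ controller_profile }}"
    type: bond                               # assumed; only the variable names appear in the log
    interface_name: "{{ controller_device }}"
  - name: "{{ port1_profile }}"
    type: ethernet                           # assumed
    interface_name: "{{ dhcp_interface1 }}"
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    type: ethernet                           # assumed
    interface_name: "{{ dhcp_interface2 }}"
    controller: "{{ controller_profile }}"

With no wireless or team entries in such a list, both __network_wireless_connections_defined and __network_team_connections_defined come out False, which is what the evaluation immediately below reports before skipping the NetworkManager-restart consent task.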
8238 1726882389.13928: variable 'port2_profile' from source: play vars 8238 1726882389.13931: variable 'port2_profile' from source: play vars 8238 1726882389.13933: variable 'dhcp_interface2' from source: play vars 8238 1726882389.13934: variable 'dhcp_interface2' from source: play vars 8238 1726882389.13936: variable 'controller_profile' from source: play vars 8238 1726882389.13938: variable 'controller_profile' from source: play vars 8238 1726882389.13981: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8238 1726882389.13989: when evaluation is False, skipping this task 8238 1726882389.13996: _execute() done 8238 1726882389.14006: dumping result to json 8238 1726882389.14014: done dumping result, returning 8238 1726882389.14028: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-00000000002d] 8238 1726882389.14037: sending task result for task 0affc7ec-ae25-54bc-d334-00000000002d 8238 1726882389.14152: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000002d 8238 1726882389.14161: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8238 1726882389.14217: no more pending results, returning what we have 8238 1726882389.14221: results queue empty 8238 1726882389.14224: checking for any_errors_fatal 8238 1726882389.14229: done checking for any_errors_fatal 8238 1726882389.14230: checking for max_fail_percentage 8238 1726882389.14231: done checking for max_fail_percentage 8238 1726882389.14232: checking to see if all hosts have failed and the running result is not ok 8238 1726882389.14233: done checking to see if all hosts have failed 8238 1726882389.14234: getting the remaining hosts for this loop 8238 1726882389.14235: done getting the remaining hosts for this loop 8238 1726882389.14240: getting the next task for host managed_node3 8238 1726882389.14247: done getting next task for host managed_node3 8238 1726882389.14251: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 8238 1726882389.14254: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882389.14269: getting variables 8238 1726882389.14270: in VariableManager get_vars() 8238 1726882389.14312: Calling all_inventory to load vars for managed_node3 8238 1726882389.14314: Calling groups_inventory to load vars for managed_node3 8238 1726882389.14316: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882389.14332: Calling all_plugins_play to load vars for managed_node3 8238 1726882389.14335: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882389.14338: Calling groups_plugins_play to load vars for managed_node3 8238 1726882389.16111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882389.18216: done with get_vars() 8238 1726882389.18256: done getting variables 8238 1726882389.18334: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:09 -0400 (0:00:00.103) 0:00:19.338 ****** 8238 1726882389.18376: entering _queue_task() for managed_node3/package 8238 1726882389.18959: worker is 1 (out of 1 available) 8238 1726882389.18971: exiting _queue_task() for managed_node3/package 8238 1726882389.18983: done queuing things up, now waiting for results queue to drain 8238 1726882389.18985: waiting for pending results... 8238 1726882389.19067: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 8238 1726882389.19213: in run() - task 0affc7ec-ae25-54bc-d334-00000000002e 8238 1726882389.19237: variable 'ansible_search_path' from source: unknown 8238 1726882389.19244: variable 'ansible_search_path' from source: unknown 8238 1726882389.19287: calling self._execute() 8238 1726882389.19401: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.19414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.19543: variable 'omit' from source: magic vars 8238 1726882389.19875: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.19895: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882389.20127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882389.20443: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882389.20493: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882389.20542: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882389.20583: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882389.20713: variable 'network_packages' from source: role '' defaults 8238 1726882389.20847: variable '__network_provider_setup' from source: role '' defaults 8238 1726882389.20868: variable '__network_service_name_default_nm' from source: role '' defaults 8238 1726882389.20931: variable '__network_service_name_default_nm' from source: role '' 
defaults 8238 1726882389.20943: variable '__network_packages_default_nm' from source: role '' defaults 8238 1726882389.21013: variable '__network_packages_default_nm' from source: role '' defaults 8238 1726882389.21229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882389.28127: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882389.28197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882389.28262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882389.28362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882389.28366: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882389.28369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.28425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.28428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.28470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.28486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.28600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.28603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.28606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.28758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.28778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.29053: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8238 1726882389.29196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
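For orientation while reading this stretch of the trace: the "Install packages" task is resolving network_packages from the role's defaults, and the log shows that value being layered out of provider-specific variables such as __network_service_name_default_nm, __network_packages_default_nm and __network_packages_default_gobject_packages. The sketch below only illustrates the kind of defaults layering these lookups imply; the variable names come from the log, but the concrete values and expressions are assumptions, not the role's actual defaults file.

    # Illustrative defaults layering only -- not the real
    # fedora.linux_system_roles.network defaults/main.yml.
    __network_service_name_default_nm: NetworkManager
    __network_packages_default_nm:
      - NetworkManager                                      # assumed package set for the "nm" provider
    network_packages: "{{ __network_packages_default_nm }}" # assumed: selected per network_provider
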
8238 1726882389.29231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.29310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.29315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.29331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.29402: variable 'ansible_python' from source: facts 8238 1726882389.29424: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8238 1726882389.29491: variable '__network_wpa_supplicant_required' from source: role '' defaults 8238 1726882389.29551: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8238 1726882389.29646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.29666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.29684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.29715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.29729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.29767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.29787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.29807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.29838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.29849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.29957: variable 
'network_connections' from source: task vars 8238 1726882389.29960: variable 'controller_profile' from source: play vars 8238 1726882389.30037: variable 'controller_profile' from source: play vars 8238 1726882389.30045: variable 'controller_device' from source: play vars 8238 1726882389.30117: variable 'controller_device' from source: play vars 8238 1726882389.30129: variable 'port1_profile' from source: play vars 8238 1726882389.30204: variable 'port1_profile' from source: play vars 8238 1726882389.30212: variable 'dhcp_interface1' from source: play vars 8238 1726882389.30289: variable 'dhcp_interface1' from source: play vars 8238 1726882389.30296: variable 'controller_profile' from source: play vars 8238 1726882389.30372: variable 'controller_profile' from source: play vars 8238 1726882389.30381: variable 'port2_profile' from source: play vars 8238 1726882389.30452: variable 'port2_profile' from source: play vars 8238 1726882389.30464: variable 'dhcp_interface2' from source: play vars 8238 1726882389.30536: variable 'dhcp_interface2' from source: play vars 8238 1726882389.30543: variable 'controller_profile' from source: play vars 8238 1726882389.30619: variable 'controller_profile' from source: play vars 8238 1726882389.30674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882389.30697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882389.30719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.30746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882389.30779: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882389.30991: variable 'network_connections' from source: task vars 8238 1726882389.30994: variable 'controller_profile' from source: play vars 8238 1726882389.31071: variable 'controller_profile' from source: play vars 8238 1726882389.31079: variable 'controller_device' from source: play vars 8238 1726882389.31154: variable 'controller_device' from source: play vars 8238 1726882389.31165: variable 'port1_profile' from source: play vars 8238 1726882389.31240: variable 'port1_profile' from source: play vars 8238 1726882389.31248: variable 'dhcp_interface1' from source: play vars 8238 1726882389.31338: variable 'dhcp_interface1' from source: play vars 8238 1726882389.31342: variable 'controller_profile' from source: play vars 8238 1726882389.31528: variable 'controller_profile' from source: play vars 8238 1726882389.31532: variable 'port2_profile' from source: play vars 8238 1726882389.31571: variable 'port2_profile' from source: play vars 8238 1726882389.31574: variable 'dhcp_interface2' from source: play vars 8238 1726882389.31650: variable 'dhcp_interface2' from source: play vars 8238 1726882389.31672: variable 'controller_profile' from source: play vars 8238 1726882389.31842: variable 'controller_profile' from source: play vars 8238 1726882389.31845: variable '__network_packages_default_wireless' from source: role '' defaults 8238 
1726882389.31898: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882389.32226: variable 'network_connections' from source: task vars 8238 1726882389.32229: variable 'controller_profile' from source: play vars 8238 1726882389.32299: variable 'controller_profile' from source: play vars 8238 1726882389.32307: variable 'controller_device' from source: play vars 8238 1726882389.32383: variable 'controller_device' from source: play vars 8238 1726882389.32390: variable 'port1_profile' from source: play vars 8238 1726882389.32458: variable 'port1_profile' from source: play vars 8238 1726882389.32494: variable 'dhcp_interface1' from source: play vars 8238 1726882389.32528: variable 'dhcp_interface1' from source: play vars 8238 1726882389.32535: variable 'controller_profile' from source: play vars 8238 1726882389.32603: variable 'controller_profile' from source: play vars 8238 1726882389.32614: variable 'port2_profile' from source: play vars 8238 1726882389.32678: variable 'port2_profile' from source: play vars 8238 1726882389.32725: variable 'dhcp_interface2' from source: play vars 8238 1726882389.32749: variable 'dhcp_interface2' from source: play vars 8238 1726882389.32759: variable 'controller_profile' from source: play vars 8238 1726882389.32830: variable 'controller_profile' from source: play vars 8238 1726882389.32893: variable '__network_packages_default_team' from source: role '' defaults 8238 1726882389.32938: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882389.33270: variable 'network_connections' from source: task vars 8238 1726882389.33281: variable 'controller_profile' from source: play vars 8238 1726882389.33321: variable 'controller_profile' from source: play vars 8238 1726882389.33329: variable 'controller_device' from source: play vars 8238 1726882389.33398: variable 'controller_device' from source: play vars 8238 1726882389.33401: variable 'port1_profile' from source: play vars 8238 1726882389.33446: variable 'port1_profile' from source: play vars 8238 1726882389.33452: variable 'dhcp_interface1' from source: play vars 8238 1726882389.33503: variable 'dhcp_interface1' from source: play vars 8238 1726882389.33506: variable 'controller_profile' from source: play vars 8238 1726882389.33559: variable 'controller_profile' from source: play vars 8238 1726882389.33565: variable 'port2_profile' from source: play vars 8238 1726882389.33611: variable 'port2_profile' from source: play vars 8238 1726882389.33625: variable 'dhcp_interface2' from source: play vars 8238 1726882389.33671: variable 'dhcp_interface2' from source: play vars 8238 1726882389.33677: variable 'controller_profile' from source: play vars 8238 1726882389.33723: variable 'controller_profile' from source: play vars 8238 1726882389.33775: variable '__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882389.33817: variable '__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882389.33826: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882389.33874: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882389.34026: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8238 1726882389.34351: variable 'network_connections' from source: task vars 8238 1726882389.34359: variable 'controller_profile' from source: play vars 8238 1726882389.34407: variable 'controller_profile' from 
source: play vars 8238 1726882389.34415: variable 'controller_device' from source: play vars 8238 1726882389.34461: variable 'controller_device' from source: play vars 8238 1726882389.34468: variable 'port1_profile' from source: play vars 8238 1726882389.34514: variable 'port1_profile' from source: play vars 8238 1726882389.34520: variable 'dhcp_interface1' from source: play vars 8238 1726882389.34567: variable 'dhcp_interface1' from source: play vars 8238 1726882389.34573: variable 'controller_profile' from source: play vars 8238 1726882389.34619: variable 'controller_profile' from source: play vars 8238 1726882389.34627: variable 'port2_profile' from source: play vars 8238 1726882389.34672: variable 'port2_profile' from source: play vars 8238 1726882389.34678: variable 'dhcp_interface2' from source: play vars 8238 1726882389.34726: variable 'dhcp_interface2' from source: play vars 8238 1726882389.34732: variable 'controller_profile' from source: play vars 8238 1726882389.34777: variable 'controller_profile' from source: play vars 8238 1726882389.34784: variable 'ansible_distribution' from source: facts 8238 1726882389.34787: variable '__network_rh_distros' from source: role '' defaults 8238 1726882389.34793: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.34826: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8238 1726882389.35000: variable 'ansible_distribution' from source: facts 8238 1726882389.35004: variable '__network_rh_distros' from source: role '' defaults 8238 1726882389.35007: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.35009: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8238 1726882389.35333: variable 'ansible_distribution' from source: facts 8238 1726882389.35336: variable '__network_rh_distros' from source: role '' defaults 8238 1726882389.35338: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.35341: variable 'network_provider' from source: set_fact 8238 1726882389.35343: variable 'ansible_facts' from source: unknown 8238 1726882389.35986: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 8238 1726882389.35989: when evaluation is False, skipping this task 8238 1726882389.35992: _execute() done 8238 1726882389.35994: dumping result to json 8238 1726882389.35997: done dumping result, returning 8238 1726882389.36008: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-54bc-d334-00000000002e] 8238 1726882389.36011: sending task result for task 0affc7ec-ae25-54bc-d334-00000000002e 8238 1726882389.36117: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000002e 8238 1726882389.36119: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 8238 1726882389.36166: no more pending results, returning what we have 8238 1726882389.36169: results queue empty 8238 1726882389.36170: checking for any_errors_fatal 8238 1726882389.36176: done checking for any_errors_fatal 8238 1726882389.36176: checking for max_fail_percentage 8238 1726882389.36178: done checking for max_fail_percentage 8238 1726882389.36179: checking to see if all hosts have failed and the running result is not ok 8238 1726882389.36179: done checking to see if all 
hosts have failed 8238 1726882389.36180: getting the remaining hosts for this loop 8238 1726882389.36181: done getting the remaining hosts for this loop 8238 1726882389.36185: getting the next task for host managed_node3 8238 1726882389.36191: done getting next task for host managed_node3 8238 1726882389.36195: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8238 1726882389.36198: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882389.36212: getting variables 8238 1726882389.36216: in VariableManager get_vars() 8238 1726882389.36257: Calling all_inventory to load vars for managed_node3 8238 1726882389.36260: Calling groups_inventory to load vars for managed_node3 8238 1726882389.36262: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882389.36271: Calling all_plugins_play to load vars for managed_node3 8238 1726882389.36274: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882389.36277: Calling groups_plugins_play to load vars for managed_node3 8238 1726882389.41031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882389.42184: done with get_vars() 8238 1726882389.42207: done getting variables 8238 1726882389.42255: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:09 -0400 (0:00:00.239) 0:00:19.577 ****** 8238 1726882389.42279: entering _queue_task() for managed_node3/package 8238 1726882389.42554: worker is 1 (out of 1 available) 8238 1726882389.42568: exiting _queue_task() for managed_node3/package 8238 1726882389.42581: done queuing things up, now waiting for results queue to drain 8238 1726882389.42583: waiting for pending results... 
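The "Install packages" task just skipped above (task path roles/network/tasks/main.yml:73) was bypassed because its conditional evaluated to false: every package in network_packages is already present in ansible_facts.packages. A minimal sketch of a task gated this way is shown below; the when expression, task name, and the use of the package module are taken from the log, while the task body is an assumption rather than the role's actual source.

    - name: Install packages
      package:                              # the log shows the 'package' action plugin being loaded
        name: "{{ network_packages }}"      # assumed argument
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
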
8238 1726882389.42772: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8238 1726882389.42880: in run() - task 0affc7ec-ae25-54bc-d334-00000000002f 8238 1726882389.42893: variable 'ansible_search_path' from source: unknown 8238 1726882389.42898: variable 'ansible_search_path' from source: unknown 8238 1726882389.42932: calling self._execute() 8238 1726882389.43010: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.43014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.43026: variable 'omit' from source: magic vars 8238 1726882389.43331: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.43341: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882389.43433: variable 'network_state' from source: role '' defaults 8238 1726882389.43442: Evaluated conditional (network_state != {}): False 8238 1726882389.43446: when evaluation is False, skipping this task 8238 1726882389.43448: _execute() done 8238 1726882389.43454: dumping result to json 8238 1726882389.43457: done dumping result, returning 8238 1726882389.43470: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-54bc-d334-00000000002f] 8238 1726882389.43475: sending task result for task 0affc7ec-ae25-54bc-d334-00000000002f 8238 1726882389.43580: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000002f 8238 1726882389.43584: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882389.43639: no more pending results, returning what we have 8238 1726882389.43643: results queue empty 8238 1726882389.43644: checking for any_errors_fatal 8238 1726882389.43655: done checking for any_errors_fatal 8238 1726882389.43656: checking for max_fail_percentage 8238 1726882389.43657: done checking for max_fail_percentage 8238 1726882389.43658: checking to see if all hosts have failed and the running result is not ok 8238 1726882389.43659: done checking to see if all hosts have failed 8238 1726882389.43660: getting the remaining hosts for this loop 8238 1726882389.43661: done getting the remaining hosts for this loop 8238 1726882389.43665: getting the next task for host managed_node3 8238 1726882389.43671: done getting next task for host managed_node3 8238 1726882389.43675: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8238 1726882389.43678: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882389.43700: getting variables 8238 1726882389.43702: in VariableManager get_vars() 8238 1726882389.43739: Calling all_inventory to load vars for managed_node3 8238 1726882389.43742: Calling groups_inventory to load vars for managed_node3 8238 1726882389.43744: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882389.43756: Calling all_plugins_play to load vars for managed_node3 8238 1726882389.43758: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882389.43761: Calling groups_plugins_play to load vars for managed_node3 8238 1726882389.44842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882389.46030: done with get_vars() 8238 1726882389.46056: done getting variables 8238 1726882389.46108: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:09 -0400 (0:00:00.038) 0:00:19.616 ****** 8238 1726882389.46136: entering _queue_task() for managed_node3/package 8238 1726882389.46407: worker is 1 (out of 1 available) 8238 1726882389.46424: exiting _queue_task() for managed_node3/package 8238 1726882389.46438: done queuing things up, now waiting for results queue to drain 8238 1726882389.46440: waiting for pending results... 8238 1726882389.46629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8238 1726882389.46723: in run() - task 0affc7ec-ae25-54bc-d334-000000000030 8238 1726882389.46736: variable 'ansible_search_path' from source: unknown 8238 1726882389.46740: variable 'ansible_search_path' from source: unknown 8238 1726882389.46775: calling self._execute() 8238 1726882389.46856: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.46860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.46870: variable 'omit' from source: magic vars 8238 1726882389.47170: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.47181: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882389.47274: variable 'network_state' from source: role '' defaults 8238 1726882389.47283: Evaluated conditional (network_state != {}): False 8238 1726882389.47287: when evaluation is False, skipping this task 8238 1726882389.47290: _execute() done 8238 1726882389.47292: dumping result to json 8238 1726882389.47297: done dumping result, returning 8238 1726882389.47304: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-54bc-d334-000000000030] 8238 1726882389.47311: sending task result for task 0affc7ec-ae25-54bc-d334-000000000030 8238 1726882389.47417: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000030 8238 1726882389.47421: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 8238 1726882389.47476: no more pending results, returning what we have 8238 1726882389.47480: results queue empty 8238 1726882389.47481: checking for any_errors_fatal 8238 1726882389.47491: done checking for any_errors_fatal 8238 1726882389.47491: checking for max_fail_percentage 8238 1726882389.47493: done checking for max_fail_percentage 8238 1726882389.47494: checking to see if all hosts have failed and the running result is not ok 8238 1726882389.47495: done checking to see if all hosts have failed 8238 1726882389.47496: getting the remaining hosts for this loop 8238 1726882389.47497: done getting the remaining hosts for this loop 8238 1726882389.47501: getting the next task for host managed_node3 8238 1726882389.47508: done getting next task for host managed_node3 8238 1726882389.47512: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8238 1726882389.47515: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882389.47532: getting variables 8238 1726882389.47535: in VariableManager get_vars() 8238 1726882389.47573: Calling all_inventory to load vars for managed_node3 8238 1726882389.47576: Calling groups_inventory to load vars for managed_node3 8238 1726882389.47578: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882389.47587: Calling all_plugins_play to load vars for managed_node3 8238 1726882389.47589: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882389.47591: Calling groups_plugins_play to load vars for managed_node3 8238 1726882389.48561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882389.49716: done with get_vars() 8238 1726882389.49741: done getting variables 8238 1726882389.49828: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:09 -0400 (0:00:00.037) 0:00:19.653 ****** 8238 1726882389.49856: entering _queue_task() for managed_node3/service 8238 1726882389.49858: Creating lock for service 8238 1726882389.50130: worker is 1 (out of 1 available) 8238 1726882389.50147: exiting _queue_task() for managed_node3/service 8238 1726882389.50163: done queuing things up, now waiting for results queue to drain 8238 1726882389.50165: waiting for pending results... 
8238 1726882389.50353: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8238 1726882389.50445: in run() - task 0affc7ec-ae25-54bc-d334-000000000031 8238 1726882389.50458: variable 'ansible_search_path' from source: unknown 8238 1726882389.50462: variable 'ansible_search_path' from source: unknown 8238 1726882389.50499: calling self._execute() 8238 1726882389.50631: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.50636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.50638: variable 'omit' from source: magic vars 8238 1726882389.50904: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.50914: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882389.51007: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882389.51207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882389.53064: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882389.53113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882389.53232: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882389.53236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882389.53239: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882389.53279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.53304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.53323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.53355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.53365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.53404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.53425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.53445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8238 1726882389.53473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.53486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.53519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.53539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.53560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.53586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.53598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.53726: variable 'network_connections' from source: task vars 8238 1726882389.53737: variable 'controller_profile' from source: play vars 8238 1726882389.53794: variable 'controller_profile' from source: play vars 8238 1726882389.53802: variable 'controller_device' from source: play vars 8238 1726882389.53854: variable 'controller_device' from source: play vars 8238 1726882389.53858: variable 'port1_profile' from source: play vars 8238 1726882389.53908: variable 'port1_profile' from source: play vars 8238 1726882389.53915: variable 'dhcp_interface1' from source: play vars 8238 1726882389.53964: variable 'dhcp_interface1' from source: play vars 8238 1726882389.53968: variable 'controller_profile' from source: play vars 8238 1726882389.54016: variable 'controller_profile' from source: play vars 8238 1726882389.54024: variable 'port2_profile' from source: play vars 8238 1726882389.54069: variable 'port2_profile' from source: play vars 8238 1726882389.54075: variable 'dhcp_interface2' from source: play vars 8238 1726882389.54121: variable 'dhcp_interface2' from source: play vars 8238 1726882389.54128: variable 'controller_profile' from source: play vars 8238 1726882389.54174: variable 'controller_profile' from source: play vars 8238 1726882389.54230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882389.54359: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882389.54392: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882389.54418: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882389.54443: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882389.54478: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882389.54494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882389.54514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.54537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882389.54589: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882389.54757: variable 'network_connections' from source: task vars 8238 1726882389.54763: variable 'controller_profile' from source: play vars 8238 1726882389.54810: variable 'controller_profile' from source: play vars 8238 1726882389.54816: variable 'controller_device' from source: play vars 8238 1726882389.54865: variable 'controller_device' from source: play vars 8238 1726882389.54873: variable 'port1_profile' from source: play vars 8238 1726882389.54917: variable 'port1_profile' from source: play vars 8238 1726882389.54925: variable 'dhcp_interface1' from source: play vars 8238 1726882389.54971: variable 'dhcp_interface1' from source: play vars 8238 1726882389.54977: variable 'controller_profile' from source: play vars 8238 1726882389.55024: variable 'controller_profile' from source: play vars 8238 1726882389.55030: variable 'port2_profile' from source: play vars 8238 1726882389.55078: variable 'port2_profile' from source: play vars 8238 1726882389.55082: variable 'dhcp_interface2' from source: play vars 8238 1726882389.55128: variable 'dhcp_interface2' from source: play vars 8238 1726882389.55134: variable 'controller_profile' from source: play vars 8238 1726882389.55179: variable 'controller_profile' from source: play vars 8238 1726882389.55207: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8238 1726882389.55210: when evaluation is False, skipping this task 8238 1726882389.55213: _execute() done 8238 1726882389.55216: dumping result to json 8238 1726882389.55220: done dumping result, returning 8238 1726882389.55230: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-000000000031] 8238 1726882389.55235: sending task result for task 0affc7ec-ae25-54bc-d334-000000000031 8238 1726882389.55329: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000031 8238 1726882389.55333: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8238 1726882389.55384: no more pending results, returning what we have 8238 1726882389.55388: results queue empty 8238 1726882389.55389: checking for any_errors_fatal 8238 1726882389.55396: done checking for any_errors_fatal 8238 1726882389.55396: checking for max_fail_percentage 8238 1726882389.55398: done checking for max_fail_percentage 8238 1726882389.55399: checking to see if all hosts have failed and 
the running result is not ok 8238 1726882389.55400: done checking to see if all hosts have failed 8238 1726882389.55400: getting the remaining hosts for this loop 8238 1726882389.55402: done getting the remaining hosts for this loop 8238 1726882389.55406: getting the next task for host managed_node3 8238 1726882389.55413: done getting next task for host managed_node3 8238 1726882389.55417: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8238 1726882389.55420: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882389.55437: getting variables 8238 1726882389.55438: in VariableManager get_vars() 8238 1726882389.55485: Calling all_inventory to load vars for managed_node3 8238 1726882389.55488: Calling groups_inventory to load vars for managed_node3 8238 1726882389.55490: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882389.55500: Calling all_plugins_play to load vars for managed_node3 8238 1726882389.55503: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882389.55505: Calling groups_plugins_play to load vars for managed_node3 8238 1726882389.56647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882389.57827: done with get_vars() 8238 1726882389.57852: done getting variables 8238 1726882389.57900: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:09 -0400 (0:00:00.080) 0:00:19.734 ****** 8238 1726882389.57928: entering _queue_task() for managed_node3/service 8238 1726882389.58201: worker is 1 (out of 1 available) 8238 1726882389.58215: exiting _queue_task() for managed_node3/service 8238 1726882389.58231: done queuing things up, now waiting for results queue to drain 8238 1726882389.58233: waiting for pending results... 
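The "Restart NetworkManager due to wireless or team interfaces" task above (main.yml:109) was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined evaluated true for the configured bond profiles. The log shows the service action plugin being loaded for it; a minimal sketch under that assumption, with the body itself not confirmed by the trace:

    - name: Restart NetworkManager due to wireless or team interfaces
      service:
        name: "{{ network_service_name }}"  # assumed; the variable appears later in the trace
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
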
8238 1726882389.58844: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8238 1726882389.59004: in run() - task 0affc7ec-ae25-54bc-d334-000000000032 8238 1726882389.59013: variable 'ansible_search_path' from source: unknown 8238 1726882389.59112: variable 'ansible_search_path' from source: unknown 8238 1726882389.59117: calling self._execute() 8238 1726882389.59189: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.59201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.59228: variable 'omit' from source: magic vars 8238 1726882389.59665: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.59675: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882389.59801: variable 'network_provider' from source: set_fact 8238 1726882389.59805: variable 'network_state' from source: role '' defaults 8238 1726882389.59814: Evaluated conditional (network_provider == "nm" or network_state != {}): True 8238 1726882389.59820: variable 'omit' from source: magic vars 8238 1726882389.59869: variable 'omit' from source: magic vars 8238 1726882389.59895: variable 'network_service_name' from source: role '' defaults 8238 1726882389.59948: variable 'network_service_name' from source: role '' defaults 8238 1726882389.60030: variable '__network_provider_setup' from source: role '' defaults 8238 1726882389.60035: variable '__network_service_name_default_nm' from source: role '' defaults 8238 1726882389.60085: variable '__network_service_name_default_nm' from source: role '' defaults 8238 1726882389.60096: variable '__network_packages_default_nm' from source: role '' defaults 8238 1726882389.60145: variable '__network_packages_default_nm' from source: role '' defaults 8238 1726882389.60316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882389.62629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882389.62634: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882389.62661: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882389.62705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882389.62742: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882389.62836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.62876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.62905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.62956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 
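Unlike the previous tasks, "Enable and start NetworkManager" (main.yml:122) will actually run: the trace above shows its conditional network_provider == "nm" or network_state != {} evaluating to True and network_service_name being resolved from role defaults. A sketch of what such a task plausibly looks like; only the name, the service module, the conditional, and the variable are confirmed by the log, the rest is assumed.

    - name: Enable and start NetworkManager
      service:
        name: "{{ network_service_name }}"
        state: started     # assumed
        enabled: true      # assumed
      when: network_provider == "nm" or network_state != {}
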
8238 1726882389.62977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.63064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.63173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.63664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.63667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.63670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.63830: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8238 1726882389.64263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.64294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.64328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.64476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.64495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.64706: variable 'ansible_python' from source: facts 8238 1726882389.64737: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8238 1726882389.64921: variable '__network_wpa_supplicant_required' from source: role '' defaults 8238 1726882389.65116: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8238 1726882389.65734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.65738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.65740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.65743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.65745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.66005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882389.66048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882389.66259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.66447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882389.66454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882389.66657: variable 'network_connections' from source: task vars 8238 1726882389.66736: variable 'controller_profile' from source: play vars 8238 1726882389.66916: variable 'controller_profile' from source: play vars 8238 1726882389.66959: variable 'controller_device' from source: play vars 8238 1726882389.67079: variable 'controller_device' from source: play vars 8238 1726882389.67255: variable 'port1_profile' from source: play vars 8238 1726882389.67378: variable 'port1_profile' from source: play vars 8238 1726882389.67476: variable 'dhcp_interface1' from source: play vars 8238 1726882389.67680: variable 'dhcp_interface1' from source: play vars 8238 1726882389.67700: variable 'controller_profile' from source: play vars 8238 1726882389.67899: variable 'controller_profile' from source: play vars 8238 1726882389.67917: variable 'port2_profile' from source: play vars 8238 1726882389.68097: variable 'port2_profile' from source: play vars 8238 1726882389.68118: variable 'dhcp_interface2' from source: play vars 8238 1726882389.68203: variable 'dhcp_interface2' from source: play vars 8238 1726882389.68427: variable 'controller_profile' from source: play vars 8238 1726882389.68431: variable 'controller_profile' from source: play vars 8238 1726882389.68645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882389.69296: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882389.69576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882389.69674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882389.69748: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882389.69830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882389.69876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882389.69918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882389.69973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882389.70038: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882389.70389: variable 'network_connections' from source: task vars 8238 1726882389.70405: variable 'controller_profile' from source: play vars 8238 1726882389.70501: variable 'controller_profile' from source: play vars 8238 1726882389.70518: variable 'controller_device' from source: play vars 8238 1726882389.70605: variable 'controller_device' from source: play vars 8238 1726882389.70721: variable 'port1_profile' from source: play vars 8238 1726882389.70728: variable 'port1_profile' from source: play vars 8238 1726882389.70730: variable 'dhcp_interface1' from source: play vars 8238 1726882389.70802: variable 'dhcp_interface1' from source: play vars 8238 1726882389.71128: variable 'controller_profile' from source: play vars 8238 1726882389.71131: variable 'controller_profile' from source: play vars 8238 1726882389.71134: variable 'port2_profile' from source: play vars 8238 1726882389.71527: variable 'port2_profile' from source: play vars 8238 1726882389.71532: variable 'dhcp_interface2' from source: play vars 8238 1726882389.71535: variable 'dhcp_interface2' from source: play vars 8238 1726882389.71537: variable 'controller_profile' from source: play vars 8238 1726882389.72127: variable 'controller_profile' from source: play vars 8238 1726882389.72130: variable '__network_packages_default_wireless' from source: role '' defaults 8238 1726882389.72133: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882389.72962: variable 'network_connections' from source: task vars 8238 1726882389.72977: variable 'controller_profile' from source: play vars 8238 1726882389.73114: variable 'controller_profile' from source: play vars 8238 1726882389.73132: variable 'controller_device' from source: play vars 8238 1726882389.73264: variable 'controller_device' from source: play vars 8238 1726882389.73299: variable 'port1_profile' from source: play vars 8238 1726882389.73383: variable 'port1_profile' from source: play vars 8238 1726882389.73406: variable 'dhcp_interface1' from source: play vars 8238 1726882389.73487: variable 'dhcp_interface1' from source: play vars 8238 1726882389.73505: variable 'controller_profile' from source: play vars 8238 1726882389.73589: variable 'controller_profile' from source: play vars 8238 1726882389.73602: variable 'port2_profile' from source: play vars 8238 1726882389.73691: variable 'port2_profile' from source: play vars 8238 1726882389.73722: variable 'dhcp_interface2' from source: play vars 8238 1726882389.73795: 
variable 'dhcp_interface2' from source: play vars 8238 1726882389.73832: variable 'controller_profile' from source: play vars 8238 1726882389.73899: variable 'controller_profile' from source: play vars 8238 1726882389.73941: variable '__network_packages_default_team' from source: role '' defaults 8238 1726882389.74049: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882389.74660: variable 'network_connections' from source: task vars 8238 1726882389.74673: variable 'controller_profile' from source: play vars 8238 1726882389.74760: variable 'controller_profile' from source: play vars 8238 1726882389.74773: variable 'controller_device' from source: play vars 8238 1726882389.74867: variable 'controller_device' from source: play vars 8238 1726882389.74886: variable 'port1_profile' from source: play vars 8238 1726882389.74976: variable 'port1_profile' from source: play vars 8238 1726882389.74988: variable 'dhcp_interface1' from source: play vars 8238 1726882389.75074: variable 'dhcp_interface1' from source: play vars 8238 1726882389.75086: variable 'controller_profile' from source: play vars 8238 1726882389.75177: variable 'controller_profile' from source: play vars 8238 1726882389.75190: variable 'port2_profile' from source: play vars 8238 1726882389.75330: variable 'port2_profile' from source: play vars 8238 1726882389.75333: variable 'dhcp_interface2' from source: play vars 8238 1726882389.75375: variable 'dhcp_interface2' from source: play vars 8238 1726882389.75387: variable 'controller_profile' from source: play vars 8238 1726882389.75473: variable 'controller_profile' from source: play vars 8238 1726882389.75553: variable '__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882389.75631: variable '__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882389.75644: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882389.75763: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882389.75997: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8238 1726882389.77140: variable 'network_connections' from source: task vars 8238 1726882389.77143: variable 'controller_profile' from source: play vars 8238 1726882389.77146: variable 'controller_profile' from source: play vars 8238 1726882389.77148: variable 'controller_device' from source: play vars 8238 1726882389.77359: variable 'controller_device' from source: play vars 8238 1726882389.77374: variable 'port1_profile' from source: play vars 8238 1726882389.77442: variable 'port1_profile' from source: play vars 8238 1726882389.77538: variable 'dhcp_interface1' from source: play vars 8238 1726882389.77728: variable 'dhcp_interface1' from source: play vars 8238 1726882389.77792: variable 'controller_profile' from source: play vars 8238 1726882389.77809: variable 'controller_profile' from source: play vars 8238 1726882389.78010: variable 'port2_profile' from source: play vars 8238 1726882389.78013: variable 'port2_profile' from source: play vars 8238 1726882389.78015: variable 'dhcp_interface2' from source: play vars 8238 1726882389.78162: variable 'dhcp_interface2' from source: play vars 8238 1726882389.78174: variable 'controller_profile' from source: play vars 8238 1726882389.78359: variable 'controller_profile' from source: play vars 8238 1726882389.78374: variable 'ansible_distribution' from source: facts 8238 1726882389.78382: variable 
'__network_rh_distros' from source: role '' defaults 8238 1726882389.78392: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.78427: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8238 1726882389.78886: variable 'ansible_distribution' from source: facts 8238 1726882389.78896: variable '__network_rh_distros' from source: role '' defaults 8238 1726882389.78906: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.78918: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8238 1726882389.79341: variable 'ansible_distribution' from source: facts 8238 1726882389.79352: variable '__network_rh_distros' from source: role '' defaults 8238 1726882389.79362: variable 'ansible_distribution_major_version' from source: facts 8238 1726882389.79403: variable 'network_provider' from source: set_fact 8238 1726882389.79455: variable 'omit' from source: magic vars 8238 1726882389.79572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882389.79606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882389.79669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882389.79827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882389.79830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882389.79833: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882389.79835: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.79837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.80190: Set connection var ansible_connection to ssh 8238 1726882389.80193: Set connection var ansible_shell_type to sh 8238 1726882389.80195: Set connection var ansible_pipelining to False 8238 1726882389.80197: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882389.80200: Set connection var ansible_timeout to 10 8238 1726882389.80202: Set connection var ansible_shell_executable to /bin/sh 8238 1726882389.80203: variable 'ansible_shell_executable' from source: unknown 8238 1726882389.80205: variable 'ansible_connection' from source: unknown 8238 1726882389.80207: variable 'ansible_module_compression' from source: unknown 8238 1726882389.80209: variable 'ansible_shell_type' from source: unknown 8238 1726882389.80211: variable 'ansible_shell_executable' from source: unknown 8238 1726882389.80408: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882389.80411: variable 'ansible_pipelining' from source: unknown 8238 1726882389.80413: variable 'ansible_timeout' from source: unknown 8238 1726882389.80415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882389.80545: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882389.80566: variable 'omit' from source: magic vars 8238 1726882389.80575: starting 
attempt loop 8238 1726882389.80582: running the handler 8238 1726882389.80930: variable 'ansible_facts' from source: unknown 8238 1726882389.82814: _low_level_execute_command(): starting 8238 1726882389.82832: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882389.84106: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882389.84174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882389.84203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882389.84460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882389.84562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882389.86325: stdout chunk (state=3): >>>/root <<< 8238 1726882389.86498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882389.86511: stdout chunk (state=3): >>><<< 8238 1726882389.86530: stderr chunk (state=3): >>><<< 8238 1726882389.86557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882389.86828: _low_level_execute_command(): starting 8238 1726882389.86832: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886 `" && echo ansible-tmp-1726882389.8672924-9002-146094592949886="` echo 
/root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886 `" ) && sleep 0' 8238 1726882389.87888: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882389.87902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882389.87914: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882389.88237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882389.88287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882389.90266: stdout chunk (state=3): >>>ansible-tmp-1726882389.8672924-9002-146094592949886=/root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886 <<< 8238 1726882389.90454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882389.90528: stderr chunk (state=3): >>><<< 8238 1726882389.90538: stdout chunk (state=3): >>><<< 8238 1726882389.90572: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882389.8672924-9002-146094592949886=/root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882389.90643: variable 'ansible_module_compression' from source: unknown 8238 1726882389.90777: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 8238 1726882389.90814: ANSIBALLZ: Acquiring lock 8238 1726882389.90845: ANSIBALLZ: Lock acquired: 140036204254016 8238 1726882389.90861: ANSIBALLZ: 
Creating module 8238 1726882390.55499: ANSIBALLZ: Writing module into payload 8238 1726882390.55758: ANSIBALLZ: Writing module 8238 1726882390.55763: ANSIBALLZ: Renaming module 8238 1726882390.55765: ANSIBALLZ: Done creating module 8238 1726882390.55867: variable 'ansible_facts' from source: unknown 8238 1726882390.56049: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/AnsiballZ_systemd.py 8238 1726882390.56320: Sending initial data 8238 1726882390.56327: Sent initial data (154 bytes) 8238 1726882390.57038: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882390.57076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882390.57095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882390.57107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882390.57225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882390.59132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882390.59266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882390.59297: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp1dybeplj /root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/AnsiballZ_systemd.py <<< 8238 1726882390.59319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/AnsiballZ_systemd.py" <<< 8238 1726882390.59401: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp1dybeplj" to remote "/root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/AnsiballZ_systemd.py" <<< 8238 1726882390.61947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882390.62068: stderr chunk (state=3): >>><<< 8238 1726882390.62087: stdout chunk (state=3): >>><<< 8238 1726882390.62118: done transferring module to remote 8238 1726882390.62140: _low_level_execute_command(): starting 8238 1726882390.62151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/ /root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/AnsiballZ_systemd.py && sleep 0' 8238 1726882390.63017: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882390.63042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882390.63062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882390.63093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882390.63113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882390.63137: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882390.63207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882390.63257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882390.63283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882390.63332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882390.63435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882390.65282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882390.65371: stderr chunk (state=3): >>><<< 8238 1726882390.65384: stdout chunk (state=3): >>><<< 8238 1726882390.65411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882390.65427: _low_level_execute_command(): starting 8238 1726882390.65439: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/AnsiballZ_systemd.py && sleep 0' 8238 1726882390.66098: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882390.66113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882390.66133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882390.66154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882390.66175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882390.66302: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882390.66307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882390.66330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882390.66455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882390.98083: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", 
"RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11603968", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3494563840", "CPUUsageNSec": "694063000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCOR<<< 8238 1726882390.98141: stdout chunk 
(state=3): >>>E": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": 
"network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8238 1726882390.99926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882391.00048: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882391.00054: stdout chunk (state=3): >>><<< 8238 1726882391.00057: stderr chunk (state=3): >>><<< 8238 1726882391.00230: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11603968", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3494563840", "CPUUsageNSec": "694063000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", 
"MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882391.00308: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882391.00337: _low_level_execute_command(): starting 8238 1726882391.00358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882389.8672924-9002-146094592949886/ > /dev/null 2>&1 && sleep 0' 8238 1726882391.01136: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.01201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882391.01220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882391.01265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882391.01378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882391.03360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882391.03364: stdout chunk (state=3): 
>>><<< 8238 1726882391.03527: stderr chunk (state=3): >>><<< 8238 1726882391.03531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882391.03534: handler run complete 8238 1726882391.03536: attempt loop complete, returning result 8238 1726882391.03538: _execute() done 8238 1726882391.03541: dumping result to json 8238 1726882391.03543: done dumping result, returning 8238 1726882391.03545: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-54bc-d334-000000000032] 8238 1726882391.03547: sending task result for task 0affc7ec-ae25-54bc-d334-000000000032 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882391.04080: no more pending results, returning what we have 8238 1726882391.04084: results queue empty 8238 1726882391.04085: checking for any_errors_fatal 8238 1726882391.04092: done checking for any_errors_fatal 8238 1726882391.04093: checking for max_fail_percentage 8238 1726882391.04100: done checking for max_fail_percentage 8238 1726882391.04101: checking to see if all hosts have failed and the running result is not ok 8238 1726882391.04102: done checking to see if all hosts have failed 8238 1726882391.04103: getting the remaining hosts for this loop 8238 1726882391.04104: done getting the remaining hosts for this loop 8238 1726882391.04109: getting the next task for host managed_node3 8238 1726882391.04116: done getting next task for host managed_node3 8238 1726882391.04120: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8238 1726882391.04125: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882391.04220: getting variables 8238 1726882391.04224: in VariableManager get_vars() 8238 1726882391.04272: Calling all_inventory to load vars for managed_node3 8238 1726882391.04275: Calling groups_inventory to load vars for managed_node3 8238 1726882391.04278: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882391.04326: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000032 8238 1726882391.04331: WORKER PROCESS EXITING 8238 1726882391.04342: Calling all_plugins_play to load vars for managed_node3 8238 1726882391.04345: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882391.04349: Calling groups_plugins_play to load vars for managed_node3 8238 1726882391.06504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882391.08790: done with get_vars() 8238 1726882391.08828: done getting variables 8238 1726882391.08958: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:11 -0400 (0:00:01.511) 0:00:21.245 ****** 8238 1726882391.09048: entering _queue_task() for managed_node3/service 8238 1726882391.09337: worker is 1 (out of 1 available) 8238 1726882391.09351: exiting _queue_task() for managed_node3/service 8238 1726882391.09364: done queuing things up, now waiting for results queue to drain 8238 1726882391.09366: waiting for pending results... 
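The task result just above comes from ansible.legacy.systemd invoked with module_args {"name": "NetworkManager", "state": "started", "enabled": true}: the unit was already enabled and active, so the module reported ok rather than changed, and no_log hid the detailed output. As a rough illustration only (this is not the systemd module's actual implementation; ensure_service is a hypothetical helper), the same check-then-converge behaviour can be sketched in Python around the systemctl CLI:

    import subprocess

    def systemctl(*args):
        # Run systemctl and return (exit_code, stdout); non-zero codes are not
        # raised so callers can inspect results like "is-enabled" -> "disabled".
        proc = subprocess.run(["systemctl", *args], capture_output=True, text=True)
        return proc.returncode, proc.stdout.strip()

    def ensure_service(name):
        changed = False
        _, enabled = systemctl("is-enabled", name)
        if enabled != "enabled":
            systemctl("enable", name)
            changed = True
        _, active = systemctl("is-active", name)
        if active != "active":
            systemctl("start", name)
            changed = True
        # "systemctl show <unit>" prints Key=Value pairs resembling the "status"
        # dict captured in the log (ActiveState, SubState, UnitFileState, ...).
        _, show = systemctl("show", name, "--no-pager")
        status = dict(line.split("=", 1) for line in show.splitlines() if "=" in line)
        return {"name": name, "changed": changed, "status": status}

    # On the managed node in this log, ensure_service("NetworkManager") would
    # return changed=False, matching the "ok" (not "changed") task result.
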
8238 1726882391.09558: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8238 1726882391.09661: in run() - task 0affc7ec-ae25-54bc-d334-000000000033 8238 1726882391.09674: variable 'ansible_search_path' from source: unknown 8238 1726882391.09678: variable 'ansible_search_path' from source: unknown 8238 1726882391.09709: calling self._execute() 8238 1726882391.09783: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882391.09787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882391.09796: variable 'omit' from source: magic vars 8238 1726882391.10101: variable 'ansible_distribution_major_version' from source: facts 8238 1726882391.10111: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882391.10201: variable 'network_provider' from source: set_fact 8238 1726882391.10205: Evaluated conditional (network_provider == "nm"): True 8238 1726882391.10276: variable '__network_wpa_supplicant_required' from source: role '' defaults 8238 1726882391.10341: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8238 1726882391.10475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882391.12483: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882391.12532: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882391.12564: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882391.12591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882391.12613: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882391.12691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882391.12713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882391.12733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882391.12770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882391.12780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882391.12817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882391.12836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8238 1726882391.12859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882391.12888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882391.12899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882391.12932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882391.12949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882391.12972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882391.13000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882391.13011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882391.13115: variable 'network_connections' from source: task vars 8238 1726882391.13125: variable 'controller_profile' from source: play vars 8238 1726882391.13176: variable 'controller_profile' from source: play vars 8238 1726882391.13186: variable 'controller_device' from source: play vars 8238 1726882391.13233: variable 'controller_device' from source: play vars 8238 1726882391.13241: variable 'port1_profile' from source: play vars 8238 1726882391.13288: variable 'port1_profile' from source: play vars 8238 1726882391.13292: variable 'dhcp_interface1' from source: play vars 8238 1726882391.13340: variable 'dhcp_interface1' from source: play vars 8238 1726882391.13346: variable 'controller_profile' from source: play vars 8238 1726882391.13391: variable 'controller_profile' from source: play vars 8238 1726882391.13397: variable 'port2_profile' from source: play vars 8238 1726882391.13446: variable 'port2_profile' from source: play vars 8238 1726882391.13452: variable 'dhcp_interface2' from source: play vars 8238 1726882391.13497: variable 'dhcp_interface2' from source: play vars 8238 1726882391.13503: variable 'controller_profile' from source: play vars 8238 1726882391.13551: variable 'controller_profile' from source: play vars 8238 1726882391.13605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882391.13720: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882391.13753: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882391.13778: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882391.13800: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882391.13835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882391.13853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882391.13874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882391.13896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882391.13937: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882391.14117: variable 'network_connections' from source: task vars 8238 1726882391.14120: variable 'controller_profile' from source: play vars 8238 1726882391.14193: variable 'controller_profile' from source: play vars 8238 1726882391.14196: variable 'controller_device' from source: play vars 8238 1726882391.14233: variable 'controller_device' from source: play vars 8238 1726882391.14240: variable 'port1_profile' from source: play vars 8238 1726882391.14310: variable 'port1_profile' from source: play vars 8238 1726882391.14529: variable 'dhcp_interface1' from source: play vars 8238 1726882391.14532: variable 'dhcp_interface1' from source: play vars 8238 1726882391.14534: variable 'controller_profile' from source: play vars 8238 1726882391.14537: variable 'controller_profile' from source: play vars 8238 1726882391.14539: variable 'port2_profile' from source: play vars 8238 1726882391.14541: variable 'port2_profile' from source: play vars 8238 1726882391.14546: variable 'dhcp_interface2' from source: play vars 8238 1726882391.14614: variable 'dhcp_interface2' from source: play vars 8238 1726882391.14628: variable 'controller_profile' from source: play vars 8238 1726882391.14694: variable 'controller_profile' from source: play vars 8238 1726882391.14739: Evaluated conditional (__network_wpa_supplicant_required): False 8238 1726882391.14747: when evaluation is False, skipping this task 8238 1726882391.14758: _execute() done 8238 1726882391.14767: dumping result to json 8238 1726882391.14776: done dumping result, returning 8238 1726882391.14787: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-54bc-d334-000000000033] 8238 1726882391.14798: sending task result for task 0affc7ec-ae25-54bc-d334-000000000033 8238 1726882391.14917: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000033 8238 1726882391.14928: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 8238 1726882391.14981: no more pending results, returning what we have 8238 1726882391.14985: results queue empty 8238 1726882391.14986: checking for any_errors_fatal 8238 1726882391.15012: done checking for any_errors_fatal 8238 1726882391.15012: checking for max_fail_percentage 
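The skip above is driven entirely by role variables: the 'service' action plugin is loaded, the distribution and provider conditionals pass, and the task then bails out because __network_wpa_supplicant_required (resolved from the role defaults, right after __network_ieee802_1x_connections_defined and __network_wireless_connections_defined, while walking network_connections) evaluates to False. A rough reconstruction of the guarded task, assuming the conditionals shown in the log and a service name taken from the task title rather than from the role source:

    # Hedged reconstruction of roles/network/tasks/main.yml:133 (not the actual source)
    - name: Enable and start wpa_supplicant
      service:
        name: wpa_supplicant      # assumed: inferred from the task title, not shown in the log
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'   # evaluated True above
        - network_provider == "nm"                    # evaluated True above
        - __network_wpa_supplicant_required           # evaluated False -> task skipped

None of the bond/ethernet profiles in network_connections are wireless or 802.1X connections, so the requirement flag stays False and wpa_supplicant is never touched on managed_node3.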
8238 1726882391.15013: done checking for max_fail_percentage 8238 1726882391.15014: checking to see if all hosts have failed and the running result is not ok 8238 1726882391.15015: done checking to see if all hosts have failed 8238 1726882391.15016: getting the remaining hosts for this loop 8238 1726882391.15017: done getting the remaining hosts for this loop 8238 1726882391.15021: getting the next task for host managed_node3 8238 1726882391.15031: done getting next task for host managed_node3 8238 1726882391.15035: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 8238 1726882391.15038: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882391.15053: getting variables 8238 1726882391.15055: in VariableManager get_vars() 8238 1726882391.15095: Calling all_inventory to load vars for managed_node3 8238 1726882391.15097: Calling groups_inventory to load vars for managed_node3 8238 1726882391.15099: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882391.15109: Calling all_plugins_play to load vars for managed_node3 8238 1726882391.15111: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882391.15114: Calling groups_plugins_play to load vars for managed_node3 8238 1726882391.16562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882391.17775: done with get_vars() 8238 1726882391.17792: done getting variables 8238 1726882391.17863: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:11 -0400 (0:00:00.088) 0:00:21.334 ****** 8238 1726882391.17897: entering _queue_task() for managed_node3/service 8238 1726882391.18238: worker is 1 (out of 1 available) 8238 1726882391.18253: exiting _queue_task() for managed_node3/service 8238 1726882391.18265: done queuing things up, now waiting for results queue to drain 8238 1726882391.18267: waiting for pending results... 
8238 1726882391.18568: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 8238 1726882391.18706: in run() - task 0affc7ec-ae25-54bc-d334-000000000034 8238 1726882391.18740: variable 'ansible_search_path' from source: unknown 8238 1726882391.18755: variable 'ansible_search_path' from source: unknown 8238 1726882391.18800: calling self._execute() 8238 1726882391.18909: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882391.18968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882391.18972: variable 'omit' from source: magic vars 8238 1726882391.19330: variable 'ansible_distribution_major_version' from source: facts 8238 1726882391.19348: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882391.19470: variable 'network_provider' from source: set_fact 8238 1726882391.19474: Evaluated conditional (network_provider == "initscripts"): False 8238 1726882391.19477: when evaluation is False, skipping this task 8238 1726882391.19480: _execute() done 8238 1726882391.19485: dumping result to json 8238 1726882391.19487: done dumping result, returning 8238 1726882391.19516: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-54bc-d334-000000000034] 8238 1726882391.19520: sending task result for task 0affc7ec-ae25-54bc-d334-000000000034 8238 1726882391.19600: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000034 8238 1726882391.19603: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882391.19662: no more pending results, returning what we have 8238 1726882391.19666: results queue empty 8238 1726882391.19667: checking for any_errors_fatal 8238 1726882391.19675: done checking for any_errors_fatal 8238 1726882391.19675: checking for max_fail_percentage 8238 1726882391.19677: done checking for max_fail_percentage 8238 1726882391.19678: checking to see if all hosts have failed and the running result is not ok 8238 1726882391.19678: done checking to see if all hosts have failed 8238 1726882391.19679: getting the remaining hosts for this loop 8238 1726882391.19680: done getting the remaining hosts for this loop 8238 1726882391.19684: getting the next task for host managed_node3 8238 1726882391.19690: done getting next task for host managed_node3 8238 1726882391.19694: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8238 1726882391.19697: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882391.19711: getting variables 8238 1726882391.19712: in VariableManager get_vars() 8238 1726882391.19748: Calling all_inventory to load vars for managed_node3 8238 1726882391.19753: Calling groups_inventory to load vars for managed_node3 8238 1726882391.19755: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882391.19763: Calling all_plugins_play to load vars for managed_node3 8238 1726882391.19766: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882391.19769: Calling groups_plugins_play to load vars for managed_node3 8238 1726882391.21151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882391.22284: done with get_vars() 8238 1726882391.22302: done getting variables 8238 1726882391.22347: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:11 -0400 (0:00:00.044) 0:00:21.378 ****** 8238 1726882391.22372: entering _queue_task() for managed_node3/copy 8238 1726882391.22578: worker is 1 (out of 1 available) 8238 1726882391.22592: exiting _queue_task() for managed_node3/copy 8238 1726882391.22603: done queuing things up, now waiting for results queue to drain 8238 1726882391.22605: waiting for pending results... 8238 1726882391.22792: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8238 1726882391.22924: in run() - task 0affc7ec-ae25-54bc-d334-000000000035 8238 1726882391.22957: variable 'ansible_search_path' from source: unknown 8238 1726882391.22961: variable 'ansible_search_path' from source: unknown 8238 1726882391.22992: calling self._execute() 8238 1726882391.23079: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882391.23094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882391.23099: variable 'omit' from source: magic vars 8238 1726882391.23660: variable 'ansible_distribution_major_version' from source: facts 8238 1726882391.23664: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882391.23667: variable 'network_provider' from source: set_fact 8238 1726882391.23669: Evaluated conditional (network_provider == "initscripts"): False 8238 1726882391.23672: when evaluation is False, skipping this task 8238 1726882391.23674: _execute() done 8238 1726882391.23683: dumping result to json 8238 1726882391.23685: done dumping result, returning 8238 1726882391.23689: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-54bc-d334-000000000035] 8238 1726882391.23692: sending task result for task 0affc7ec-ae25-54bc-d334-000000000035 8238 1726882391.23803: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000035 8238 1726882391.23807: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional 
result was False" } 8238 1726882391.23860: no more pending results, returning what we have 8238 1726882391.23864: results queue empty 8238 1726882391.23865: checking for any_errors_fatal 8238 1726882391.23871: done checking for any_errors_fatal 8238 1726882391.23872: checking for max_fail_percentage 8238 1726882391.23874: done checking for max_fail_percentage 8238 1726882391.23875: checking to see if all hosts have failed and the running result is not ok 8238 1726882391.23876: done checking to see if all hosts have failed 8238 1726882391.23877: getting the remaining hosts for this loop 8238 1726882391.23878: done getting the remaining hosts for this loop 8238 1726882391.23882: getting the next task for host managed_node3 8238 1726882391.23897: done getting next task for host managed_node3 8238 1726882391.23902: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8238 1726882391.23905: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882391.23921: getting variables 8238 1726882391.23925: in VariableManager get_vars() 8238 1726882391.23967: Calling all_inventory to load vars for managed_node3 8238 1726882391.23970: Calling groups_inventory to load vars for managed_node3 8238 1726882391.23973: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882391.23995: Calling all_plugins_play to load vars for managed_node3 8238 1726882391.24006: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882391.24010: Calling groups_plugins_play to load vars for managed_node3 8238 1726882391.25128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882391.26252: done with get_vars() 8238 1726882391.26269: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:11 -0400 (0:00:00.039) 0:00:21.418 ****** 8238 1726882391.26335: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8238 1726882391.26336: Creating lock for fedora.linux_system_roles.network_connections 8238 1726882391.26560: worker is 1 (out of 1 available) 8238 1726882391.26575: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8238 1726882391.26588: done queuing things up, now waiting for results queue to drain 8238 1726882391.26590: waiting for pending results... 
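Before the network_connections run that starts next, note the two skips above: "Enable network service" and "Ensure initscripts network file dependency is present" are both gated on network_provider == "initscripts", which is False because the provider resolved to "nm". The first result is shown only as "censored" because, as the log states, the task sets no_log: true; the second prints its false_condition normally. A minimal sketch of that gating pattern, with the service name assumed rather than taken from the role source (the copy-based task at main.yml:150 carries the same when condition):

    # Hedged sketch of the initscripts-only guard behind the two skipped tasks above
    - name: Enable network service
      service:
        name: network             # assumed legacy initscripts service name
        state: started
        enabled: true
      when: network_provider == "initscripts"
      no_log: true                # why the skipped result above appears as "censored"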
8238 1726882391.26767: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8238 1726882391.26857: in run() - task 0affc7ec-ae25-54bc-d334-000000000036 8238 1726882391.26870: variable 'ansible_search_path' from source: unknown 8238 1726882391.26874: variable 'ansible_search_path' from source: unknown 8238 1726882391.26903: calling self._execute() 8238 1726882391.26981: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882391.26985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882391.26994: variable 'omit' from source: magic vars 8238 1726882391.27289: variable 'ansible_distribution_major_version' from source: facts 8238 1726882391.27299: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882391.27305: variable 'omit' from source: magic vars 8238 1726882391.27346: variable 'omit' from source: magic vars 8238 1726882391.27472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882391.29015: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882391.29065: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882391.29092: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882391.29120: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882391.29145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882391.29203: variable 'network_provider' from source: set_fact 8238 1726882391.29304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882391.29339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882391.29362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882391.29390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882391.29401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882391.29462: variable 'omit' from source: magic vars 8238 1726882391.29543: variable 'omit' from source: magic vars 8238 1726882391.29620: variable 'network_connections' from source: task vars 8238 1726882391.29631: variable 'controller_profile' from source: play vars 8238 1726882391.29682: variable 'controller_profile' from source: play vars 8238 1726882391.29689: variable 'controller_device' from source: play vars 8238 1726882391.29735: variable 'controller_device' from source: play vars 8238 1726882391.29744: variable 'port1_profile' from source: play vars 8238 
1726882391.29793: variable 'port1_profile' from source: play vars 8238 1726882391.29800: variable 'dhcp_interface1' from source: play vars 8238 1726882391.29846: variable 'dhcp_interface1' from source: play vars 8238 1726882391.29852: variable 'controller_profile' from source: play vars 8238 1726882391.29900: variable 'controller_profile' from source: play vars 8238 1726882391.29907: variable 'port2_profile' from source: play vars 8238 1726882391.29952: variable 'port2_profile' from source: play vars 8238 1726882391.29960: variable 'dhcp_interface2' from source: play vars 8238 1726882391.30008: variable 'dhcp_interface2' from source: play vars 8238 1726882391.30014: variable 'controller_profile' from source: play vars 8238 1726882391.30061: variable 'controller_profile' from source: play vars 8238 1726882391.30191: variable 'omit' from source: magic vars 8238 1726882391.30200: variable '__lsr_ansible_managed' from source: task vars 8238 1726882391.30247: variable '__lsr_ansible_managed' from source: task vars 8238 1726882391.30379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8238 1726882391.30532: Loaded config def from plugin (lookup/template) 8238 1726882391.30535: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 8238 1726882391.30561: File lookup term: get_ansible_managed.j2 8238 1726882391.30565: variable 'ansible_search_path' from source: unknown 8238 1726882391.30569: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 8238 1726882391.30581: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 8238 1726882391.30595: variable 'ansible_search_path' from source: unknown 8238 1726882391.34706: variable 'ansible_managed' from source: unknown 8238 1726882391.34802: variable 'omit' from source: magic vars 8238 1726882391.34826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882391.34846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882391.34864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882391.34880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882391.34888: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882391.34911: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882391.34914: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882391.34917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882391.34989: Set connection var ansible_connection to ssh 8238 1726882391.34992: Set connection var ansible_shell_type to sh 8238 1726882391.34995: Set connection var ansible_pipelining to False 8238 1726882391.35002: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882391.35012: Set connection var ansible_timeout to 10 8238 1726882391.35015: Set connection var ansible_shell_executable to /bin/sh 8238 1726882391.35034: variable 'ansible_shell_executable' from source: unknown 8238 1726882391.35037: variable 'ansible_connection' from source: unknown 8238 1726882391.35039: variable 'ansible_module_compression' from source: unknown 8238 1726882391.35042: variable 'ansible_shell_type' from source: unknown 8238 1726882391.35044: variable 'ansible_shell_executable' from source: unknown 8238 1726882391.35047: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882391.35051: variable 'ansible_pipelining' from source: unknown 8238 1726882391.35057: variable 'ansible_timeout' from source: unknown 8238 1726882391.35061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882391.35158: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882391.35167: variable 'omit' from source: magic vars 8238 1726882391.35174: starting attempt loop 8238 1726882391.35178: running the handler 8238 1726882391.35189: _low_level_execute_command(): starting 8238 1726882391.35197: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882391.35733: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882391.35737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.35740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882391.35742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882391.35744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.35802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882391.35806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 
1726882391.35808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882391.35900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882391.37601: stdout chunk (state=3): >>>/root <<< 8238 1726882391.37712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882391.37768: stderr chunk (state=3): >>><<< 8238 1726882391.37771: stdout chunk (state=3): >>><<< 8238 1726882391.37791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882391.37804: _low_level_execute_command(): starting 8238 1726882391.37807: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943 `" && echo ansible-tmp-1726882391.3779004-9061-116839110479943="` echo /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943 `" ) && sleep 0' 8238 1726882391.38270: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882391.38273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.38276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882391.38278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.38327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882391.38331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882391.38348: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882391.38431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882391.40381: stdout chunk (state=3): >>>ansible-tmp-1726882391.3779004-9061-116839110479943=/root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943 <<< 8238 1726882391.40499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882391.40546: stderr chunk (state=3): >>><<< 8238 1726882391.40549: stdout chunk (state=3): >>><<< 8238 1726882391.40564: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882391.3779004-9061-116839110479943=/root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882391.40603: variable 'ansible_module_compression' from source: unknown 8238 1726882391.40647: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 8238 1726882391.40653: ANSIBALLZ: Acquiring lock 8238 1726882391.40656: ANSIBALLZ: Lock acquired: 140036202546672 8238 1726882391.40659: ANSIBALLZ: Creating module 8238 1726882391.55081: ANSIBALLZ: Writing module into payload 8238 1726882391.55311: ANSIBALLZ: Writing module 8238 1726882391.55335: ANSIBALLZ: Renaming module 8238 1726882391.55340: ANSIBALLZ: Done creating module 8238 1726882391.55364: variable 'ansible_facts' from source: unknown 8238 1726882391.55426: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/AnsiballZ_network_connections.py 8238 1726882391.55534: Sending initial data 8238 1726882391.55537: Sent initial data (166 bytes) 8238 1726882391.56006: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882391.56041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882391.56044: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.56046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882391.56049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.56104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882391.56107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882391.56114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882391.56199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882391.57830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8238 1726882391.57834: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882391.57910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882391.57996: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp_vy767gx /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/AnsiballZ_network_connections.py <<< 8238 1726882391.57999: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/AnsiballZ_network_connections.py" <<< 8238 1726882391.58078: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp_vy767gx" to remote "/root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/AnsiballZ_network_connections.py" <<< 8238 1726882391.58084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/AnsiballZ_network_connections.py" <<< 8238 1726882391.58985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882391.59056: stderr chunk (state=3): >>><<< 8238 1726882391.59059: stdout chunk (state=3): >>><<< 8238 1726882391.59080: done transferring module to remote 8238 1726882391.59090: _low_level_execute_command(): starting 8238 1726882391.59095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/ /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/AnsiballZ_network_connections.py && sleep 0' 8238 1726882391.59741: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.59828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882391.59844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882391.59885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882391.59999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882391.61828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882391.61873: stderr chunk (state=3): >>><<< 8238 1726882391.61876: stdout chunk (state=3): >>><<< 8238 1726882391.61890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882391.61893: _low_level_execute_command(): starting 8238 1726882391.61898: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/AnsiballZ_network_connections.py && sleep 0' 8238 1726882391.62458: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882391.62462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882391.62465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882391.62506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882391.62610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.18771: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": 
{"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 8238 1726882392.20775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882392.20838: stderr chunk (state=3): >>><<< 8238 1726882392.20841: stdout chunk (state=3): >>><<< 8238 1726882392.20862: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882392.20910: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882392.20919: _low_level_execute_command(): starting 8238 1726882392.20925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882391.3779004-9061-116839110479943/ > /dev/null 2>&1 && sleep 0' 8238 1726882392.21406: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882392.21411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882392.21413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882392.21416: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882392.21419: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.21458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882392.21463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882392.21480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882392.21580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.23530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882392.23576: stderr chunk (state=3): >>><<< 8238 1726882392.23581: stdout chunk (state=3): >>><<< 8238 1726882392.23592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882392.23598: handler run complete 8238 1726882392.23627: attempt loop complete, returning result 8238 1726882392.23631: _execute() done 8238 1726882392.23633: dumping result to json 8238 1726882392.23641: done dumping result, returning 8238 1726882392.23649: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-54bc-d334-000000000036] 8238 1726882392.23656: sending task result for task 0affc7ec-ae25-54bc-d334-000000000036 8238 1726882392.23779: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000036 8238 1726882392.23782: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6 [009] #2, state:up persistent_state:present, 'bond0.1': add 
connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d (not-active) 8238 1726882392.23940: no more pending results, returning what we have 8238 1726882392.23943: results queue empty 8238 1726882392.23944: checking for any_errors_fatal 8238 1726882392.23951: done checking for any_errors_fatal 8238 1726882392.23951: checking for max_fail_percentage 8238 1726882392.23953: done checking for max_fail_percentage 8238 1726882392.23954: checking to see if all hosts have failed and the running result is not ok 8238 1726882392.23955: done checking to see if all hosts have failed 8238 1726882392.23956: getting the remaining hosts for this loop 8238 1726882392.23957: done getting the remaining hosts for this loop 8238 1726882392.23961: getting the next task for host managed_node3 8238 1726882392.23968: done getting next task for host managed_node3 8238 1726882392.23971: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 8238 1726882392.23974: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882392.23984: getting variables 8238 1726882392.23985: in VariableManager get_vars() 8238 1726882392.24032: Calling all_inventory to load vars for managed_node3 8238 1726882392.24035: Calling groups_inventory to load vars for managed_node3 8238 1726882392.24038: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882392.24047: Calling all_plugins_play to load vars for managed_node3 8238 1726882392.24049: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882392.24052: Calling groups_plugins_play to load vars for managed_node3 8238 1726882392.25017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882392.26242: done with get_vars() 8238 1726882392.26261: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:12 -0400 (0:00:00.999) 0:00:22.418 ****** 8238 1726882392.26327: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8238 1726882392.26328: Creating lock for fedora.linux_system_roles.network_state 8238 1726882392.26551: worker is 1 (out of 1 available) 8238 1726882392.26564: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8238 1726882392.26576: done queuing things up, now waiting for results queue to drain 8238 1726882392.26578: waiting for pending results... 
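
For reference, the module invocation recorded above corresponds to role input roughly like the following. This is a hedged reconstruction from the logged module_args (connection names, bond options, route_metric4); the exact layout of the test playbook that drove this run is an assumption.

# Hedged sketch only: reconstructed from the module_args logged above;
# the actual test play may structure these variables differently.
- hosts: managed_node3
  vars:
    network_connections:
      - name: bond0
        state: up
        type: bond
        interface_name: nm-bond
        bond:
          mode: active-backup
          miimon: 110
        ip:
          route_metric4: 65535
      - name: bond0.0
        state: up
        type: ethernet
        interface_name: test1
        controller: bond0
      - name: bond0.1
        state: up
        type: ethernet
        interface_name: test2
        controller: bond0
  roles:
    - fedora.linux_system_roles.network
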
8238 1726882392.26757: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 8238 1726882392.26849: in run() - task 0affc7ec-ae25-54bc-d334-000000000037 8238 1726882392.26866: variable 'ansible_search_path' from source: unknown 8238 1726882392.26871: variable 'ansible_search_path' from source: unknown 8238 1726882392.26901: calling self._execute() 8238 1726882392.26978: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.26983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.26992: variable 'omit' from source: magic vars 8238 1726882392.27284: variable 'ansible_distribution_major_version' from source: facts 8238 1726882392.27293: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882392.27385: variable 'network_state' from source: role '' defaults 8238 1726882392.27394: Evaluated conditional (network_state != {}): False 8238 1726882392.27397: when evaluation is False, skipping this task 8238 1726882392.27400: _execute() done 8238 1726882392.27402: dumping result to json 8238 1726882392.27407: done dumping result, returning 8238 1726882392.27414: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-54bc-d334-000000000037] 8238 1726882392.27419: sending task result for task 0affc7ec-ae25-54bc-d334-000000000037 8238 1726882392.27506: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000037 8238 1726882392.27509: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882392.27564: no more pending results, returning what we have 8238 1726882392.27568: results queue empty 8238 1726882392.27568: checking for any_errors_fatal 8238 1726882392.27582: done checking for any_errors_fatal 8238 1726882392.27583: checking for max_fail_percentage 8238 1726882392.27585: done checking for max_fail_percentage 8238 1726882392.27586: checking to see if all hosts have failed and the running result is not ok 8238 1726882392.27586: done checking to see if all hosts have failed 8238 1726882392.27587: getting the remaining hosts for this loop 8238 1726882392.27588: done getting the remaining hosts for this loop 8238 1726882392.27591: getting the next task for host managed_node3 8238 1726882392.27596: done getting next task for host managed_node3 8238 1726882392.27600: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8238 1726882392.27603: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882392.27619: getting variables 8238 1726882392.27620: in VariableManager get_vars() 8238 1726882392.27655: Calling all_inventory to load vars for managed_node3 8238 1726882392.27657: Calling groups_inventory to load vars for managed_node3 8238 1726882392.27659: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882392.27668: Calling all_plugins_play to load vars for managed_node3 8238 1726882392.27670: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882392.27677: Calling groups_plugins_play to load vars for managed_node3 8238 1726882392.28569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882392.29707: done with get_vars() 8238 1726882392.29725: done getting variables 8238 1726882392.29771: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:12 -0400 (0:00:00.034) 0:00:22.453 ****** 8238 1726882392.29794: entering _queue_task() for managed_node3/debug 8238 1726882392.29986: worker is 1 (out of 1 available) 8238 1726882392.30000: exiting _queue_task() for managed_node3/debug 8238 1726882392.30013: done queuing things up, now waiting for results queue to drain 8238 1726882392.30015: waiting for pending results... 8238 1726882392.30185: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8238 1726882392.30268: in run() - task 0affc7ec-ae25-54bc-d334-000000000038 8238 1726882392.30280: variable 'ansible_search_path' from source: unknown 8238 1726882392.30284: variable 'ansible_search_path' from source: unknown 8238 1726882392.30313: calling self._execute() 8238 1726882392.30382: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.30387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.30395: variable 'omit' from source: magic vars 8238 1726882392.30677: variable 'ansible_distribution_major_version' from source: facts 8238 1726882392.30689: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882392.30695: variable 'omit' from source: magic vars 8238 1726882392.30735: variable 'omit' from source: magic vars 8238 1726882392.30763: variable 'omit' from source: magic vars 8238 1726882392.30798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882392.30827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882392.30841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882392.30858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882392.30867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882392.30892: variable 'inventory_hostname' from source: host vars 
for 'managed_node3' 8238 1726882392.30895: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.30900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.30975: Set connection var ansible_connection to ssh 8238 1726882392.30978: Set connection var ansible_shell_type to sh 8238 1726882392.30981: Set connection var ansible_pipelining to False 8238 1726882392.30987: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882392.30994: Set connection var ansible_timeout to 10 8238 1726882392.31001: Set connection var ansible_shell_executable to /bin/sh 8238 1726882392.31020: variable 'ansible_shell_executable' from source: unknown 8238 1726882392.31025: variable 'ansible_connection' from source: unknown 8238 1726882392.31028: variable 'ansible_module_compression' from source: unknown 8238 1726882392.31030: variable 'ansible_shell_type' from source: unknown 8238 1726882392.31033: variable 'ansible_shell_executable' from source: unknown 8238 1726882392.31035: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.31039: variable 'ansible_pipelining' from source: unknown 8238 1726882392.31042: variable 'ansible_timeout' from source: unknown 8238 1726882392.31046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.31155: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882392.31166: variable 'omit' from source: magic vars 8238 1726882392.31171: starting attempt loop 8238 1726882392.31174: running the handler 8238 1726882392.31274: variable '__network_connections_result' from source: set_fact 8238 1726882392.31322: handler run complete 8238 1726882392.31341: attempt loop complete, returning result 8238 1726882392.31344: _execute() done 8238 1726882392.31347: dumping result to json 8238 1726882392.31349: done dumping result, returning 8238 1726882392.31359: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-54bc-d334-000000000038] 8238 1726882392.31364: sending task result for task 0affc7ec-ae25-54bc-d334-000000000038 8238 1726882392.31452: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000038 8238 1726882392.31455: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d (not-active)" ] } 8238 1726882392.31518: no more pending results, returning what we have 8238 1726882392.31521: results queue empty 8238 
1726882392.31524: checking for any_errors_fatal 8238 1726882392.31528: done checking for any_errors_fatal 8238 1726882392.31529: checking for max_fail_percentage 8238 1726882392.31530: done checking for max_fail_percentage 8238 1726882392.31531: checking to see if all hosts have failed and the running result is not ok 8238 1726882392.31532: done checking to see if all hosts have failed 8238 1726882392.31533: getting the remaining hosts for this loop 8238 1726882392.31534: done getting the remaining hosts for this loop 8238 1726882392.31538: getting the next task for host managed_node3 8238 1726882392.31543: done getting next task for host managed_node3 8238 1726882392.31546: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8238 1726882392.31549: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882392.31558: getting variables 8238 1726882392.31559: in VariableManager get_vars() 8238 1726882392.31591: Calling all_inventory to load vars for managed_node3 8238 1726882392.31594: Calling groups_inventory to load vars for managed_node3 8238 1726882392.31596: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882392.31605: Calling all_plugins_play to load vars for managed_node3 8238 1726882392.31608: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882392.31611: Calling groups_plugins_play to load vars for managed_node3 8238 1726882392.32604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882392.33725: done with get_vars() 8238 1726882392.33742: done getting variables 8238 1726882392.33785: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:12 -0400 (0:00:00.040) 0:00:22.493 ****** 8238 1726882392.33810: entering _queue_task() for managed_node3/debug 8238 1726882392.34003: worker is 1 (out of 1 available) 8238 1726882392.34018: exiting _queue_task() for managed_node3/debug 8238 1726882392.34030: done queuing things up, now waiting for results queue to drain 8238 1726882392.34032: waiting for pending results... 
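
The two debug tasks around this point (roles/network/tasks/main.yml:177 and :181) surface the module's stderr and full result back to the controller. A minimal sketch of what such tasks look like follows; only the task names, the debug action, and the __network_connections_result variable are confirmed by the log, so the exact task bodies in the role may differ.

# Hedged sketch of the debug tasks whose output appears above and below.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
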
8238 1726882392.34208: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8238 1726882392.34300: in run() - task 0affc7ec-ae25-54bc-d334-000000000039 8238 1726882392.34311: variable 'ansible_search_path' from source: unknown 8238 1726882392.34314: variable 'ansible_search_path' from source: unknown 8238 1726882392.34349: calling self._execute() 8238 1726882392.34419: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.34424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.34434: variable 'omit' from source: magic vars 8238 1726882392.34718: variable 'ansible_distribution_major_version' from source: facts 8238 1726882392.34730: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882392.34735: variable 'omit' from source: magic vars 8238 1726882392.34780: variable 'omit' from source: magic vars 8238 1726882392.34810: variable 'omit' from source: magic vars 8238 1726882392.34844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882392.34875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882392.34889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882392.34903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882392.34917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882392.34944: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882392.34948: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.34953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.35029: Set connection var ansible_connection to ssh 8238 1726882392.35033: Set connection var ansible_shell_type to sh 8238 1726882392.35035: Set connection var ansible_pipelining to False 8238 1726882392.35041: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882392.35047: Set connection var ansible_timeout to 10 8238 1726882392.35128: Set connection var ansible_shell_executable to /bin/sh 8238 1726882392.35132: variable 'ansible_shell_executable' from source: unknown 8238 1726882392.35134: variable 'ansible_connection' from source: unknown 8238 1726882392.35137: variable 'ansible_module_compression' from source: unknown 8238 1726882392.35139: variable 'ansible_shell_type' from source: unknown 8238 1726882392.35141: variable 'ansible_shell_executable' from source: unknown 8238 1726882392.35143: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.35145: variable 'ansible_pipelining' from source: unknown 8238 1726882392.35147: variable 'ansible_timeout' from source: unknown 8238 1726882392.35150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.35195: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882392.35206: variable 'omit' from source: 
magic vars 8238 1726882392.35212: starting attempt loop 8238 1726882392.35215: running the handler 8238 1726882392.35326: variable '__network_connections_result' from source: set_fact 8238 1726882392.35329: variable '__network_connections_result' from source: set_fact 8238 1726882392.35438: handler run complete 8238 1726882392.35464: attempt loop complete, returning result 8238 1726882392.35467: _execute() done 8238 1726882392.35470: dumping result to json 8238 1726882392.35475: done dumping result, returning 8238 1726882392.35526: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-54bc-d334-000000000039] 8238 1726882392.35529: sending task result for task 0affc7ec-ae25-54bc-d334-000000000039 8238 1726882392.35592: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000039 8238 1726882392.35596: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8c66fb3e-08a1-411c-8a8a-97eb75d3c57e (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b1930de-0635-4914-bc1a-96ab6cfe44b6 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bcdb074f-88c7-45b9-82b3-bdb89677858d (not-active)" ] } } 8238 1726882392.35701: no more pending results, returning what we have 8238 1726882392.35704: results queue empty 8238 1726882392.35709: checking for any_errors_fatal 8238 1726882392.35714: done checking for any_errors_fatal 8238 1726882392.35715: checking for max_fail_percentage 8238 1726882392.35716: done checking for max_fail_percentage 8238 1726882392.35717: checking to see if all hosts have 
failed and the running result is not ok 8238 1726882392.35718: done checking to see if all hosts have failed 8238 1726882392.35719: getting the remaining hosts for this loop 8238 1726882392.35720: done getting the remaining hosts for this loop 8238 1726882392.35726: getting the next task for host managed_node3 8238 1726882392.35731: done getting next task for host managed_node3 8238 1726882392.35734: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8238 1726882392.35737: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882392.35746: getting variables 8238 1726882392.35747: in VariableManager get_vars() 8238 1726882392.35779: Calling all_inventory to load vars for managed_node3 8238 1726882392.35781: Calling groups_inventory to load vars for managed_node3 8238 1726882392.35783: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882392.35790: Calling all_plugins_play to load vars for managed_node3 8238 1726882392.35792: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882392.35793: Calling groups_plugins_play to load vars for managed_node3 8238 1726882392.36690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882392.37917: done with get_vars() 8238 1726882392.37936: done getting variables 8238 1726882392.37977: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:12 -0400 (0:00:00.041) 0:00:22.535 ****** 8238 1726882392.37998: entering _queue_task() for managed_node3/debug 8238 1726882392.38199: worker is 1 (out of 1 available) 8238 1726882392.38212: exiting _queue_task() for managed_node3/debug 8238 1726882392.38226: done queuing things up, now waiting for results queue to drain 8238 1726882392.38228: waiting for pending results... 
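
The network_state tasks in this run are skipped because the conditional network_state != {} evaluates to False (the role was driven purely by network_connections). A sketch of that when-gated pattern is below; the when expression is taken verbatim from the logged false_condition, while the debug payload shown is an assumption.

# Hedged sketch: only the when-condition is confirmed by the log.
- name: Show debug messages for the network_state
  debug:
    msg: "{{ network_state }}"
  when: network_state != {}
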
8238 1726882392.38398: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8238 1726882392.38486: in run() - task 0affc7ec-ae25-54bc-d334-00000000003a 8238 1726882392.38499: variable 'ansible_search_path' from source: unknown 8238 1726882392.38502: variable 'ansible_search_path' from source: unknown 8238 1726882392.38535: calling self._execute() 8238 1726882392.38607: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.38611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.38620: variable 'omit' from source: magic vars 8238 1726882392.38906: variable 'ansible_distribution_major_version' from source: facts 8238 1726882392.38916: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882392.39006: variable 'network_state' from source: role '' defaults 8238 1726882392.39013: Evaluated conditional (network_state != {}): False 8238 1726882392.39016: when evaluation is False, skipping this task 8238 1726882392.39019: _execute() done 8238 1726882392.39025: dumping result to json 8238 1726882392.39028: done dumping result, returning 8238 1726882392.39037: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-54bc-d334-00000000003a] 8238 1726882392.39042: sending task result for task 0affc7ec-ae25-54bc-d334-00000000003a 8238 1726882392.39139: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000003a 8238 1726882392.39142: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 8238 1726882392.39191: no more pending results, returning what we have 8238 1726882392.39194: results queue empty 8238 1726882392.39195: checking for any_errors_fatal 8238 1726882392.39201: done checking for any_errors_fatal 8238 1726882392.39202: checking for max_fail_percentage 8238 1726882392.39203: done checking for max_fail_percentage 8238 1726882392.39204: checking to see if all hosts have failed and the running result is not ok 8238 1726882392.39206: done checking to see if all hosts have failed 8238 1726882392.39206: getting the remaining hosts for this loop 8238 1726882392.39208: done getting the remaining hosts for this loop 8238 1726882392.39210: getting the next task for host managed_node3 8238 1726882392.39214: done getting next task for host managed_node3 8238 1726882392.39217: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 8238 1726882392.39220: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882392.39237: getting variables 8238 1726882392.39239: in VariableManager get_vars() 8238 1726882392.39273: Calling all_inventory to load vars for managed_node3 8238 1726882392.39275: Calling groups_inventory to load vars for managed_node3 8238 1726882392.39277: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882392.39286: Calling all_plugins_play to load vars for managed_node3 8238 1726882392.39289: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882392.39291: Calling groups_plugins_play to load vars for managed_node3 8238 1726882392.40191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882392.41334: done with get_vars() 8238 1726882392.41351: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:12 -0400 (0:00:00.034) 0:00:22.569 ****** 8238 1726882392.41419: entering _queue_task() for managed_node3/ping 8238 1726882392.41421: Creating lock for ping 8238 1726882392.41620: worker is 1 (out of 1 available) 8238 1726882392.41636: exiting _queue_task() for managed_node3/ping 8238 1726882392.41649: done queuing things up, now waiting for results queue to drain 8238 1726882392.41650: waiting for pending results... 8238 1726882392.41830: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 8238 1726882392.41917: in run() - task 0affc7ec-ae25-54bc-d334-00000000003b 8238 1726882392.41931: variable 'ansible_search_path' from source: unknown 8238 1726882392.41935: variable 'ansible_search_path' from source: unknown 8238 1726882392.41968: calling self._execute() 8238 1726882392.42038: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.42045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.42056: variable 'omit' from source: magic vars 8238 1726882392.42340: variable 'ansible_distribution_major_version' from source: facts 8238 1726882392.42350: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882392.42358: variable 'omit' from source: magic vars 8238 1726882392.42403: variable 'omit' from source: magic vars 8238 1726882392.42432: variable 'omit' from source: magic vars 8238 1726882392.42467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882392.42497: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882392.42514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882392.42532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882392.42540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882392.42569: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882392.42572: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.42575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.42652: Set connection var ansible_connection to ssh 8238 1726882392.42656: Set connection var 
ansible_shell_type to sh 8238 1726882392.42663: Set connection var ansible_pipelining to False 8238 1726882392.42668: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882392.42674: Set connection var ansible_timeout to 10 8238 1726882392.42681: Set connection var ansible_shell_executable to /bin/sh 8238 1726882392.42699: variable 'ansible_shell_executable' from source: unknown 8238 1726882392.42702: variable 'ansible_connection' from source: unknown 8238 1726882392.42705: variable 'ansible_module_compression' from source: unknown 8238 1726882392.42707: variable 'ansible_shell_type' from source: unknown 8238 1726882392.42709: variable 'ansible_shell_executable' from source: unknown 8238 1726882392.42714: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882392.42724: variable 'ansible_pipelining' from source: unknown 8238 1726882392.42728: variable 'ansible_timeout' from source: unknown 8238 1726882392.42731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882392.42887: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882392.42897: variable 'omit' from source: magic vars 8238 1726882392.42902: starting attempt loop 8238 1726882392.42905: running the handler 8238 1726882392.42916: _low_level_execute_command(): starting 8238 1726882392.42925: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882392.43467: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882392.43470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.43473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882392.43476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882392.43478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.43529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882392.43533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882392.43535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882392.43627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.45287: stdout chunk (state=3): >>>/root <<< 8238 1726882392.45400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882392.45455: stderr chunk (state=3): >>><<< 8238 1726882392.45458: stdout chunk (state=3): >>><<< 8238 1726882392.45476: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882392.45488: _low_level_execute_command(): starting 8238 1726882392.45493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689 `" && echo ansible-tmp-1726882392.4547555-9091-195541854619689="` echo /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689 `" ) && sleep 0' 8238 1726882392.45963: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882392.45967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882392.45969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882392.45977: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882392.45980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.46031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882392.46042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882392.46120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.48080: stdout chunk (state=3): >>>ansible-tmp-1726882392.4547555-9091-195541854619689=/root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689 <<< 8238 1726882392.48198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882392.48245: stderr chunk (state=3): 
>>><<< 8238 1726882392.48249: stdout chunk (state=3): >>><<< 8238 1726882392.48264: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882392.4547555-9091-195541854619689=/root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882392.48305: variable 'ansible_module_compression' from source: unknown 8238 1726882392.48341: ANSIBALLZ: Using lock for ping 8238 1726882392.48344: ANSIBALLZ: Acquiring lock 8238 1726882392.48346: ANSIBALLZ: Lock acquired: 140036204353136 8238 1726882392.48349: ANSIBALLZ: Creating module 8238 1726882392.56262: ANSIBALLZ: Writing module into payload 8238 1726882392.56306: ANSIBALLZ: Writing module 8238 1726882392.56325: ANSIBALLZ: Renaming module 8238 1726882392.56331: ANSIBALLZ: Done creating module 8238 1726882392.56347: variable 'ansible_facts' from source: unknown 8238 1726882392.56395: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/AnsiballZ_ping.py 8238 1726882392.56505: Sending initial data 8238 1726882392.56509: Sent initial data (151 bytes) 8238 1726882392.57000: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882392.57004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882392.57006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882392.57008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882392.57013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.57076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' <<< 8238 1726882392.57079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882392.57086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882392.57170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.58798: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8238 1726882392.58802: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882392.58881: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882392.58969: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpp6nrko0n /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/AnsiballZ_ping.py <<< 8238 1726882392.58973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/AnsiballZ_ping.py" <<< 8238 1726882392.59053: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpp6nrko0n" to remote "/root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/AnsiballZ_ping.py" <<< 8238 1726882392.59763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882392.59828: stderr chunk (state=3): >>><<< 8238 1726882392.59833: stdout chunk (state=3): >>><<< 8238 1726882392.59852: done transferring module to remote 8238 1726882392.59868: _low_level_execute_command(): starting 8238 1726882392.59871: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/ /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/AnsiballZ_ping.py && sleep 0' 8238 1726882392.60336: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882392.60339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.60342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882392.60348: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.60395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882392.60399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882392.60489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.62289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882392.62337: stderr chunk (state=3): >>><<< 8238 1726882392.62341: stdout chunk (state=3): >>><<< 8238 1726882392.62358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882392.62361: _low_level_execute_command(): starting 8238 1726882392.62365: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/AnsiballZ_ping.py && sleep 0' 8238 1726882392.62856: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882392.62859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882392.62862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882392.62866: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882392.62868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 
1726882392.62914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882392.62924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882392.63012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.79481: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 8238 1726882392.80936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882392.80953: stderr chunk (state=3): >>><<< 8238 1726882392.80964: stdout chunk (state=3): >>><<< 8238 1726882392.80988: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
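
The "Re-test connectivity" task at roles/network/tasks/main.yml:192 is an ordinary ping module round-trip, which is why the log shows AnsiballZ_ping.py being transferred over the multiplexed SSH connection and a {"ping": "pong"} payload coming back. A minimal equivalent, as a sketch:

# Minimal sketch of the connectivity re-check seen above;
# the role's actual task may set additional options.
- name: Re-test connectivity
  ansible.builtin.ping:
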
8238 1726882392.81044: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882392.81048: _low_level_execute_command(): starting 8238 1726882392.81134: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882392.4547555-9091-195541854619689/ > /dev/null 2>&1 && sleep 0' 8238 1726882392.82239: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882392.82334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882392.82484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882392.82505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882392.82753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882392.84695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882392.84933: stderr chunk (state=3): >>><<< 8238 1726882392.84936: stdout chunk (state=3): >>><<< 8238 1726882392.84951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882392.84963: handler run complete 8238 1726882392.84987: attempt loop complete, returning result 8238 1726882392.84994: _execute() done 8238 1726882392.85001: dumping result to json 8238 1726882392.85009: done dumping result, returning 8238 1726882392.85098: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-54bc-d334-00000000003b] 8238 1726882392.85131: sending task result for task 0affc7ec-ae25-54bc-d334-00000000003b ok: [managed_node3] => { "changed": false, "ping": "pong" } 8238 1726882392.85296: no more pending results, returning what we have 8238 1726882392.85300: results queue empty 8238 1726882392.85301: checking for any_errors_fatal 8238 1726882392.85306: done checking for any_errors_fatal 8238 1726882392.85307: checking for max_fail_percentage 8238 1726882392.85309: done checking for max_fail_percentage 8238 1726882392.85309: checking to see if all hosts have failed and the running result is not ok 8238 1726882392.85310: done checking to see if all hosts have failed 8238 1726882392.85311: getting the remaining hosts for this loop 8238 1726882392.85313: done getting the remaining hosts for this loop 8238 1726882392.85317: getting the next task for host managed_node3 8238 1726882392.85329: done getting next task for host managed_node3 8238 1726882392.85332: ^ task is: TASK: meta (role_complete) 8238 1726882392.85336: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882392.85347: getting variables 8238 1726882392.85349: in VariableManager get_vars() 8238 1726882392.85395: Calling all_inventory to load vars for managed_node3 8238 1726882392.85398: Calling groups_inventory to load vars for managed_node3 8238 1726882392.85400: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882392.85411: Calling all_plugins_play to load vars for managed_node3 8238 1726882392.85414: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882392.85417: Calling groups_plugins_play to load vars for managed_node3 8238 1726882392.86559: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000003b 8238 1726882392.86563: WORKER PROCESS EXITING 8238 1726882392.88309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882392.91254: done with get_vars() 8238 1726882392.91287: done getting variables 8238 1726882392.91385: done queuing things up, now waiting for results queue to drain 8238 1726882392.91388: results queue empty 8238 1726882392.91389: checking for any_errors_fatal 8238 1726882392.91392: done checking for any_errors_fatal 8238 1726882392.91393: checking for max_fail_percentage 8238 1726882392.91394: done checking for max_fail_percentage 8238 1726882392.91395: checking to see if all hosts have failed and the running result is not ok 8238 1726882392.91396: done checking to see if all hosts have failed 8238 1726882392.91396: getting the remaining hosts for this loop 8238 1726882392.91397: done getting the remaining hosts for this loop 8238 1726882392.91400: getting the next task for host managed_node3 8238 1726882392.91405: done getting next task for host managed_node3 8238 1726882392.91407: ^ task is: TASK: Include the task 'get_interface_stat.yml' 8238 1726882392.91409: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882392.91411: getting variables 8238 1726882392.91412: in VariableManager get_vars() 8238 1726882392.91432: Calling all_inventory to load vars for managed_node3 8238 1726882392.91435: Calling groups_inventory to load vars for managed_node3 8238 1726882392.91437: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882392.91442: Calling all_plugins_play to load vars for managed_node3 8238 1726882392.91445: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882392.91447: Calling groups_plugins_play to load vars for managed_node3 8238 1726882392.94088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882392.98796: done with get_vars() 8238 1726882392.98825: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:12 -0400 (0:00:00.574) 0:00:23.144 ****** 8238 1726882392.98920: entering _queue_task() for managed_node3/include_tasks 8238 1726882392.99970: worker is 1 (out of 1 available) 8238 1726882392.99984: exiting _queue_task() for managed_node3/include_tasks 8238 1726882392.99996: done queuing things up, now waiting for results queue to drain 8238 1726882392.99998: waiting for pending results... 8238 1726882393.00279: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 8238 1726882393.00421: in run() - task 0affc7ec-ae25-54bc-d334-00000000006e 8238 1726882393.00450: variable 'ansible_search_path' from source: unknown 8238 1726882393.00462: variable 'ansible_search_path' from source: unknown 8238 1726882393.00506: calling self._execute() 8238 1726882393.00616: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.00632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.00646: variable 'omit' from source: magic vars 8238 1726882393.01071: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.01088: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.01098: _execute() done 8238 1726882393.01109: dumping result to json 8238 1726882393.01116: done dumping result, returning 8238 1726882393.01127: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-54bc-d334-00000000006e] 8238 1726882393.01214: sending task result for task 0affc7ec-ae25-54bc-d334-00000000006e 8238 1726882393.01306: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000006e 8238 1726882393.01310: WORKER PROCESS EXITING 8238 1726882393.01396: no more pending results, returning what we have 8238 1726882393.01401: in VariableManager get_vars() 8238 1726882393.01598: Calling all_inventory to load vars for managed_node3 8238 1726882393.01602: Calling groups_inventory to load vars for managed_node3 8238 1726882393.01605: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.01617: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.01620: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.01626: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.03800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 
1726882393.05647: done with get_vars() 8238 1726882393.05673: variable 'ansible_search_path' from source: unknown 8238 1726882393.05674: variable 'ansible_search_path' from source: unknown 8238 1726882393.05705: we have included files to process 8238 1726882393.05706: generating all_blocks data 8238 1726882393.05707: done generating all_blocks data 8238 1726882393.05711: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882393.05712: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882393.05713: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8238 1726882393.05861: done processing included file 8238 1726882393.05863: iterating over new_blocks loaded from include file 8238 1726882393.05864: in VariableManager get_vars() 8238 1726882393.05880: done with get_vars() 8238 1726882393.05881: filtering new block on tags 8238 1726882393.05894: done filtering new block on tags 8238 1726882393.05896: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 8238 1726882393.05900: extending task lists for all hosts with included blocks 8238 1726882393.05993: done extending task lists 8238 1726882393.05994: done processing included files 8238 1726882393.05995: results queue empty 8238 1726882393.05995: checking for any_errors_fatal 8238 1726882393.05996: done checking for any_errors_fatal 8238 1726882393.05997: checking for max_fail_percentage 8238 1726882393.05997: done checking for max_fail_percentage 8238 1726882393.05998: checking to see if all hosts have failed and the running result is not ok 8238 1726882393.05999: done checking to see if all hosts have failed 8238 1726882393.05999: getting the remaining hosts for this loop 8238 1726882393.06000: done getting the remaining hosts for this loop 8238 1726882393.06002: getting the next task for host managed_node3 8238 1726882393.06006: done getting next task for host managed_node3 8238 1726882393.06011: ^ task is: TASK: Get stat for interface {{ interface }} 8238 1726882393.06015: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882393.06019: getting variables 8238 1726882393.06020: in VariableManager get_vars() 8238 1726882393.06071: Calling all_inventory to load vars for managed_node3 8238 1726882393.06073: Calling groups_inventory to load vars for managed_node3 8238 1726882393.06075: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.06081: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.06083: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.06086: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.12929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.15076: done with get_vars() 8238 1726882393.15107: done getting variables 8238 1726882393.15485: variable 'interface' from source: task vars 8238 1726882393.15489: variable 'controller_device' from source: play vars 8238 1726882393.15556: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:13 -0400 (0:00:00.166) 0:00:23.310 ****** 8238 1726882393.15586: entering _queue_task() for managed_node3/stat 8238 1726882393.16341: worker is 1 (out of 1 available) 8238 1726882393.16353: exiting _queue_task() for managed_node3/stat 8238 1726882393.16364: done queuing things up, now waiting for results queue to drain 8238 1726882393.16366: waiting for pending results... 8238 1726882393.16678: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 8238 1726882393.16747: in run() - task 0affc7ec-ae25-54bc-d334-000000000241 8238 1726882393.16761: variable 'ansible_search_path' from source: unknown 8238 1726882393.16889: variable 'ansible_search_path' from source: unknown 8238 1726882393.16896: calling self._execute() 8238 1726882393.16906: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.16916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.16926: variable 'omit' from source: magic vars 8238 1726882393.17318: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.17323: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.17333: variable 'omit' from source: magic vars 8238 1726882393.17399: variable 'omit' from source: magic vars 8238 1726882393.17501: variable 'interface' from source: task vars 8238 1726882393.17505: variable 'controller_device' from source: play vars 8238 1726882393.17576: variable 'controller_device' from source: play vars 8238 1726882393.17595: variable 'omit' from source: magic vars 8238 1726882393.17641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882393.17685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882393.17704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882393.17749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882393.17760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 
1726882393.17768: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882393.17828: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.17832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.17897: Set connection var ansible_connection to ssh 8238 1726882393.17900: Set connection var ansible_shell_type to sh 8238 1726882393.17906: Set connection var ansible_pipelining to False 8238 1726882393.17912: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882393.17918: Set connection var ansible_timeout to 10 8238 1726882393.17929: Set connection var ansible_shell_executable to /bin/sh 8238 1726882393.17955: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.17959: variable 'ansible_connection' from source: unknown 8238 1726882393.17977: variable 'ansible_module_compression' from source: unknown 8238 1726882393.17981: variable 'ansible_shell_type' from source: unknown 8238 1726882393.17983: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.17986: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.17988: variable 'ansible_pipelining' from source: unknown 8238 1726882393.17992: variable 'ansible_timeout' from source: unknown 8238 1726882393.17995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.18230: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882393.18234: variable 'omit' from source: magic vars 8238 1726882393.18237: starting attempt loop 8238 1726882393.18240: running the handler 8238 1726882393.18242: _low_level_execute_command(): starting 8238 1726882393.18244: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882393.19030: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882393.19040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.19096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882393.19111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.19125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.19246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.21043: stdout chunk (state=3): >>>/root <<< 8238 
1726882393.21242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.21245: stdout chunk (state=3): >>><<< 8238 1726882393.21248: stderr chunk (state=3): >>><<< 8238 1726882393.21273: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882393.21293: _low_level_execute_command(): starting 8238 1726882393.21353: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655 `" && echo ansible-tmp-1726882393.2127967-9123-195226546433655="` echo /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655 `" ) && sleep 0' 8238 1726882393.21977: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882393.21991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882393.22121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.22140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.22183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.22270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.24248: stdout chunk (state=3): >>>ansible-tmp-1726882393.2127967-9123-195226546433655=/root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655 <<< 8238 1726882393.24456: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.24461: stdout chunk (state=3): >>><<< 8238 1726882393.24463: stderr chunk (state=3): >>><<< 8238 1726882393.24635: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882393.2127967-9123-195226546433655=/root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882393.24638: variable 'ansible_module_compression' from source: unknown 8238 1726882393.24640: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8238 1726882393.24665: variable 'ansible_facts' from source: unknown 8238 1726882393.24779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/AnsiballZ_stat.py 8238 1726882393.24981: Sending initial data 8238 1726882393.24991: Sent initial data (151 bytes) 8238 1726882393.25685: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882393.25700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882393.25714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882393.25745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882393.25838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.25842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.25877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 
1726882393.25990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.27603: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882393.27706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882393.27824: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpgaqysmpq /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/AnsiballZ_stat.py <<< 8238 1726882393.27847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/AnsiballZ_stat.py" <<< 8238 1726882393.27918: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpgaqysmpq" to remote "/root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/AnsiballZ_stat.py" <<< 8238 1726882393.29074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.29078: stdout chunk (state=3): >>><<< 8238 1726882393.29080: stderr chunk (state=3): >>><<< 8238 1726882393.29082: done transferring module to remote 8238 1726882393.29084: _low_level_execute_command(): starting 8238 1726882393.29087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/ /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/AnsiballZ_stat.py && sleep 0' 8238 1726882393.29728: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882393.29731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882393.29734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882393.29747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.29827: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882393.29854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.29896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.29995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.31881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.31885: stdout chunk (state=3): >>><<< 8238 1726882393.31887: stderr chunk (state=3): >>><<< 8238 1726882393.31988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882393.31992: _low_level_execute_command(): starting 8238 1726882393.31994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/AnsiballZ_stat.py && sleep 0' 8238 1726882393.32593: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882393.32638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882393.32727: stderr chunk (state=3): >>>debug2: match found <<< 8238 1726882393.32747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.32794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.32887: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 8238 1726882393.49545: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35810, "dev": 23, "nlink": 1, "atime": 1726882391.9054227, "mtime": 1726882391.9054227, "ctime": 1726882391.9054227, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8238 1726882393.51062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882393.51113: stderr chunk (state=3): >>><<< 8238 1726882393.51117: stdout chunk (state=3): >>><<< 8238 1726882393.51135: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35810, "dev": 23, "nlink": 1, "atime": 1726882391.9054227, "mtime": 1726882391.9054227, "ctime": 1726882391.9054227, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
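
The stat result above corresponds to the task loaded from get_interface_stat.yml. The following is a hypothetical reconstruction built only from the module_args echoed in that result (path, get_attributes, get_checksum, get_mime); the variable handling is an assumption, since the log later shows interface_stat coming from set_fact rather than necessarily a plain register.

# Hypothetical reconstruction of a get_interface_stat.yml-style task; the
# argument values mirror the module_args echoed in the result above, while
# the register name and variable handling are assumptions.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: /sys/class/net/{{ interface }}
  register: interface_stat
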
8238 1726882393.51176: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882393.51188: _low_level_execute_command(): starting 8238 1726882393.51195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882393.2127967-9123-195226546433655/ > /dev/null 2>&1 && sleep 0' 8238 1726882393.51610: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882393.51615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882393.51650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.51657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882393.51660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.51704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882393.51707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.51802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.53814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.53857: stderr chunk (state=3): >>><<< 8238 1726882393.53861: stdout chunk (state=3): >>><<< 8238 1726882393.53872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882393.53878: handler run complete 8238 1726882393.53914: attempt loop complete, returning result 8238 1726882393.53917: _execute() done 8238 1726882393.53925: dumping result to json 8238 1726882393.53933: done dumping result, returning 8238 1726882393.53940: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [0affc7ec-ae25-54bc-d334-000000000241] 8238 1726882393.53946: sending task result for task 0affc7ec-ae25-54bc-d334-000000000241 8238 1726882393.54065: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000241 8238 1726882393.54068: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882391.9054227, "block_size": 4096, "blocks": 0, "ctime": 1726882391.9054227, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35810, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726882391.9054227, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8238 1726882393.54151: no more pending results, returning what we have 8238 1726882393.54155: results queue empty 8238 1726882393.54156: checking for any_errors_fatal 8238 1726882393.54158: done checking for any_errors_fatal 8238 1726882393.54158: checking for max_fail_percentage 8238 1726882393.54160: done checking for max_fail_percentage 8238 1726882393.54161: checking to see if all hosts have failed and the running result is not ok 8238 1726882393.54162: done checking to see if all hosts have failed 8238 1726882393.54162: getting the remaining hosts for this loop 8238 1726882393.54164: done getting the remaining hosts for this loop 8238 1726882393.54168: getting the next task for host managed_node3 8238 1726882393.54177: done getting next task for host managed_node3 8238 1726882393.54179: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8238 1726882393.54182: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882393.54188: getting variables 8238 1726882393.54189: in VariableManager get_vars() 8238 1726882393.54232: Calling all_inventory to load vars for managed_node3 8238 1726882393.54235: Calling groups_inventory to load vars for managed_node3 8238 1726882393.54237: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.54254: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.54257: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.54260: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.55716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.57317: done with get_vars() 8238 1726882393.57336: done getting variables 8238 1726882393.57382: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882393.57475: variable 'interface' from source: task vars 8238 1726882393.57478: variable 'controller_device' from source: play vars 8238 1726882393.57520: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:13 -0400 (0:00:00.419) 0:00:23.730 ****** 8238 1726882393.57546: entering _queue_task() for managed_node3/assert 8238 1726882393.57778: worker is 1 (out of 1 available) 8238 1726882393.57791: exiting _queue_task() for managed_node3/assert 8238 1726882393.57804: done queuing things up, now waiting for results queue to drain 8238 1726882393.57806: waiting for pending results... 
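
The task queued next evaluates a single conditional, recorded below as "Evaluated conditional (interface_stat.stat.exists): True" followed by "All assertions passed". A minimal sketch of such an assertion, assuming the conditional shown in the log is the only condition checked, would be:

# Minimal sketch of the assertion at assert_device_present.yml:5, assuming
# interface_stat.stat.exists (the conditional seen in the log) is the only
# condition; the actual test file may add a failure message or more checks.
- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
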
8238 1726882393.57989: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 8238 1726882393.58079: in run() - task 0affc7ec-ae25-54bc-d334-00000000006f 8238 1726882393.58090: variable 'ansible_search_path' from source: unknown 8238 1726882393.58093: variable 'ansible_search_path' from source: unknown 8238 1726882393.58126: calling self._execute() 8238 1726882393.58204: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.58211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.58220: variable 'omit' from source: magic vars 8238 1726882393.58631: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.58634: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.58637: variable 'omit' from source: magic vars 8238 1726882393.58640: variable 'omit' from source: magic vars 8238 1726882393.58715: variable 'interface' from source: task vars 8238 1726882393.58927: variable 'controller_device' from source: play vars 8238 1726882393.58930: variable 'controller_device' from source: play vars 8238 1726882393.58933: variable 'omit' from source: magic vars 8238 1726882393.58935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882393.58938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882393.58941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882393.58964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882393.58979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882393.59018: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882393.59031: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.59039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.59163: Set connection var ansible_connection to ssh 8238 1726882393.59174: Set connection var ansible_shell_type to sh 8238 1726882393.59186: Set connection var ansible_pipelining to False 8238 1726882393.59197: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882393.59210: Set connection var ansible_timeout to 10 8238 1726882393.59226: Set connection var ansible_shell_executable to /bin/sh 8238 1726882393.59256: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.59267: variable 'ansible_connection' from source: unknown 8238 1726882393.59279: variable 'ansible_module_compression' from source: unknown 8238 1726882393.59287: variable 'ansible_shell_type' from source: unknown 8238 1726882393.59295: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.59303: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.59311: variable 'ansible_pipelining' from source: unknown 8238 1726882393.59317: variable 'ansible_timeout' from source: unknown 8238 1726882393.59329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.59498: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882393.59519: variable 'omit' from source: magic vars 8238 1726882393.59534: starting attempt loop 8238 1726882393.59537: running the handler 8238 1726882393.59638: variable 'interface_stat' from source: set_fact 8238 1726882393.59652: Evaluated conditional (interface_stat.stat.exists): True 8238 1726882393.59660: handler run complete 8238 1726882393.59672: attempt loop complete, returning result 8238 1726882393.59675: _execute() done 8238 1726882393.59677: dumping result to json 8238 1726882393.59680: done dumping result, returning 8238 1726882393.59687: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [0affc7ec-ae25-54bc-d334-00000000006f] 8238 1726882393.59693: sending task result for task 0affc7ec-ae25-54bc-d334-00000000006f 8238 1726882393.59784: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000006f 8238 1726882393.59787: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882393.59870: no more pending results, returning what we have 8238 1726882393.59874: results queue empty 8238 1726882393.59875: checking for any_errors_fatal 8238 1726882393.59879: done checking for any_errors_fatal 8238 1726882393.59880: checking for max_fail_percentage 8238 1726882393.59882: done checking for max_fail_percentage 8238 1726882393.59883: checking to see if all hosts have failed and the running result is not ok 8238 1726882393.59884: done checking to see if all hosts have failed 8238 1726882393.59884: getting the remaining hosts for this loop 8238 1726882393.59885: done getting the remaining hosts for this loop 8238 1726882393.59888: getting the next task for host managed_node3 8238 1726882393.59895: done getting next task for host managed_node3 8238 1726882393.59898: ^ task is: TASK: Include the task 'assert_profile_present.yml' 8238 1726882393.59900: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882393.59903: getting variables 8238 1726882393.59904: in VariableManager get_vars() 8238 1726882393.59940: Calling all_inventory to load vars for managed_node3 8238 1726882393.59943: Calling groups_inventory to load vars for managed_node3 8238 1726882393.59945: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.59954: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.59956: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.59959: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.60872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.62013: done with get_vars() 8238 1726882393.62032: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Friday 20 September 2024 21:33:13 -0400 (0:00:00.045) 0:00:23.776 ****** 8238 1726882393.62099: entering _queue_task() for managed_node3/include_tasks 8238 1726882393.62310: worker is 1 (out of 1 available) 8238 1726882393.62327: exiting _queue_task() for managed_node3/include_tasks 8238 1726882393.62339: done queuing things up, now waiting for results queue to drain 8238 1726882393.62341: waiting for pending results... 8238 1726882393.62523: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 8238 1726882393.62589: in run() - task 0affc7ec-ae25-54bc-d334-000000000070 8238 1726882393.62600: variable 'ansible_search_path' from source: unknown 8238 1726882393.62644: variable 'controller_profile' from source: play vars 8238 1726882393.62803: variable 'controller_profile' from source: play vars 8238 1726882393.62815: variable 'port1_profile' from source: play vars 8238 1726882393.62868: variable 'port1_profile' from source: play vars 8238 1726882393.62874: variable 'port2_profile' from source: play vars 8238 1726882393.62926: variable 'port2_profile' from source: play vars 8238 1726882393.62937: variable 'omit' from source: magic vars 8238 1726882393.63042: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.63049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.63061: variable 'omit' from source: magic vars 8238 1726882393.63245: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.63255: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.63274: variable 'item' from source: unknown 8238 1726882393.63320: variable 'item' from source: unknown 8238 1726882393.63453: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.63457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.63460: variable 'omit' from source: magic vars 8238 1726882393.63544: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.63547: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.63576: variable 'item' from source: unknown 8238 1726882393.63618: variable 'item' from source: unknown 8238 1726882393.63691: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.63694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.63816: variable 'omit' from source: 
magic vars 8238 1726882393.63820: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.63824: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.63839: variable 'item' from source: unknown 8238 1726882393.63885: variable 'item' from source: unknown 8238 1726882393.63949: dumping result to json 8238 1726882393.63955: done dumping result, returning 8238 1726882393.63958: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0affc7ec-ae25-54bc-d334-000000000070] 8238 1726882393.63961: sending task result for task 0affc7ec-ae25-54bc-d334-000000000070 8238 1726882393.63997: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000070 8238 1726882393.64000: WORKER PROCESS EXITING 8238 1726882393.64038: no more pending results, returning what we have 8238 1726882393.64042: in VariableManager get_vars() 8238 1726882393.64089: Calling all_inventory to load vars for managed_node3 8238 1726882393.64093: Calling groups_inventory to load vars for managed_node3 8238 1726882393.64095: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.64106: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.64108: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.64113: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.65192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.66326: done with get_vars() 8238 1726882393.66341: variable 'ansible_search_path' from source: unknown 8238 1726882393.66356: variable 'ansible_search_path' from source: unknown 8238 1726882393.66362: variable 'ansible_search_path' from source: unknown 8238 1726882393.66368: we have included files to process 8238 1726882393.66369: generating all_blocks data 8238 1726882393.66370: done generating all_blocks data 8238 1726882393.66374: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.66375: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.66377: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.66520: in VariableManager get_vars() 8238 1726882393.66539: done with get_vars() 8238 1726882393.66729: done processing included file 8238 1726882393.66731: iterating over new_blocks loaded from include file 8238 1726882393.66732: in VariableManager get_vars() 8238 1726882393.66744: done with get_vars() 8238 1726882393.66745: filtering new block on tags 8238 1726882393.66761: done filtering new block on tags 8238 1726882393.66763: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 8238 1726882393.66766: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.66767: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.66769: Loading data from 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.66837: in VariableManager get_vars() 8238 1726882393.66854: done with get_vars() 8238 1726882393.67014: done processing included file 8238 1726882393.67015: iterating over new_blocks loaded from include file 8238 1726882393.67016: in VariableManager get_vars() 8238 1726882393.67033: done with get_vars() 8238 1726882393.67034: filtering new block on tags 8238 1726882393.67046: done filtering new block on tags 8238 1726882393.67048: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 8238 1726882393.67050: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.67053: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.67055: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8238 1726882393.67162: in VariableManager get_vars() 8238 1726882393.67178: done with get_vars() 8238 1726882393.67339: done processing included file 8238 1726882393.67340: iterating over new_blocks loaded from include file 8238 1726882393.67341: in VariableManager get_vars() 8238 1726882393.67355: done with get_vars() 8238 1726882393.67357: filtering new block on tags 8238 1726882393.67370: done filtering new block on tags 8238 1726882393.67371: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 8238 1726882393.67374: extending task lists for all hosts with included blocks 8238 1726882393.69112: done extending task lists 8238 1726882393.69118: done processing included files 8238 1726882393.69118: results queue empty 8238 1726882393.69119: checking for any_errors_fatal 8238 1726882393.69121: done checking for any_errors_fatal 8238 1726882393.69123: checking for max_fail_percentage 8238 1726882393.69124: done checking for max_fail_percentage 8238 1726882393.69124: checking to see if all hosts have failed and the running result is not ok 8238 1726882393.69125: done checking to see if all hosts have failed 8238 1726882393.69125: getting the remaining hosts for this loop 8238 1726882393.69126: done getting the remaining hosts for this loop 8238 1726882393.69128: getting the next task for host managed_node3 8238 1726882393.69130: done getting next task for host managed_node3 8238 1726882393.69132: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8238 1726882393.69133: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882393.69135: getting variables 8238 1726882393.69136: in VariableManager get_vars() 8238 1726882393.69144: Calling all_inventory to load vars for managed_node3 8238 1726882393.69146: Calling groups_inventory to load vars for managed_node3 8238 1726882393.69147: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.69153: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.69155: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.69157: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.69999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.71144: done with get_vars() 8238 1726882393.71162: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:13 -0400 (0:00:00.091) 0:00:23.867 ****** 8238 1726882393.71215: entering _queue_task() for managed_node3/include_tasks 8238 1726882393.71489: worker is 1 (out of 1 available) 8238 1726882393.71502: exiting _queue_task() for managed_node3/include_tasks 8238 1726882393.71516: done queuing things up, now waiting for results queue to drain 8238 1726882393.71517: waiting for pending results... 8238 1726882393.71689: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 8238 1726882393.71762: in run() - task 0affc7ec-ae25-54bc-d334-00000000025f 8238 1726882393.71773: variable 'ansible_search_path' from source: unknown 8238 1726882393.71776: variable 'ansible_search_path' from source: unknown 8238 1726882393.71810: calling self._execute() 8238 1726882393.71887: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.71891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.71900: variable 'omit' from source: magic vars 8238 1726882393.72205: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.72216: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.72223: _execute() done 8238 1726882393.72226: dumping result to json 8238 1726882393.72231: done dumping result, returning 8238 1726882393.72237: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affc7ec-ae25-54bc-d334-00000000025f] 8238 1726882393.72242: sending task result for task 0affc7ec-ae25-54bc-d334-00000000025f 8238 1726882393.72332: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000025f 8238 1726882393.72335: WORKER PROCESS EXITING 8238 1726882393.72364: no more pending results, returning what we have 8238 1726882393.72369: in VariableManager get_vars() 8238 1726882393.72414: Calling all_inventory to load vars for managed_node3 8238 1726882393.72417: Calling groups_inventory to load vars for managed_node3 8238 1726882393.72419: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.72440: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.72444: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.72447: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.73409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.74556: 
done with get_vars() 8238 1726882393.74571: variable 'ansible_search_path' from source: unknown 8238 1726882393.74572: variable 'ansible_search_path' from source: unknown 8238 1726882393.74602: we have included files to process 8238 1726882393.74603: generating all_blocks data 8238 1726882393.74604: done generating all_blocks data 8238 1726882393.74605: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882393.74606: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882393.74608: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882393.75332: done processing included file 8238 1726882393.75334: iterating over new_blocks loaded from include file 8238 1726882393.75335: in VariableManager get_vars() 8238 1726882393.75348: done with get_vars() 8238 1726882393.75350: filtering new block on tags 8238 1726882393.75367: done filtering new block on tags 8238 1726882393.75369: in VariableManager get_vars() 8238 1726882393.75381: done with get_vars() 8238 1726882393.75382: filtering new block on tags 8238 1726882393.75397: done filtering new block on tags 8238 1726882393.75398: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 8238 1726882393.75401: extending task lists for all hosts with included blocks 8238 1726882393.75515: done extending task lists 8238 1726882393.75516: done processing included files 8238 1726882393.75516: results queue empty 8238 1726882393.75517: checking for any_errors_fatal 8238 1726882393.75519: done checking for any_errors_fatal 8238 1726882393.75519: checking for max_fail_percentage 8238 1726882393.75520: done checking for max_fail_percentage 8238 1726882393.75521: checking to see if all hosts have failed and the running result is not ok 8238 1726882393.75521: done checking to see if all hosts have failed 8238 1726882393.75523: getting the remaining hosts for this loop 8238 1726882393.75524: done getting the remaining hosts for this loop 8238 1726882393.75526: getting the next task for host managed_node3 8238 1726882393.75528: done getting next task for host managed_node3 8238 1726882393.75530: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8238 1726882393.75532: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882393.75534: getting variables 8238 1726882393.75535: in VariableManager get_vars() 8238 1726882393.75670: Calling all_inventory to load vars for managed_node3 8238 1726882393.75673: Calling groups_inventory to load vars for managed_node3 8238 1726882393.75674: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.75678: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.75681: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.75683: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.76469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.77630: done with get_vars() 8238 1726882393.77651: done getting variables 8238 1726882393.77685: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:13 -0400 (0:00:00.064) 0:00:23.932 ****** 8238 1726882393.77708: entering _queue_task() for managed_node3/set_fact 8238 1726882393.77980: worker is 1 (out of 1 available) 8238 1726882393.77993: exiting _queue_task() for managed_node3/set_fact 8238 1726882393.78007: done queuing things up, now waiting for results queue to drain 8238 1726882393.78009: waiting for pending results... 8238 1726882393.78194: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 8238 1726882393.78267: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b0 8238 1726882393.78278: variable 'ansible_search_path' from source: unknown 8238 1726882393.78281: variable 'ansible_search_path' from source: unknown 8238 1726882393.78312: calling self._execute() 8238 1726882393.78393: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.78397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.78408: variable 'omit' from source: magic vars 8238 1726882393.78712: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.78724: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.78731: variable 'omit' from source: magic vars 8238 1726882393.78769: variable 'omit' from source: magic vars 8238 1726882393.78798: variable 'omit' from source: magic vars 8238 1726882393.78835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882393.78868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882393.78885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882393.78904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882393.78912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882393.78940: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 8238 1726882393.78944: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.78947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.79126: Set connection var ansible_connection to ssh 8238 1726882393.79130: Set connection var ansible_shell_type to sh 8238 1726882393.79133: Set connection var ansible_pipelining to False 8238 1726882393.79136: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882393.79138: Set connection var ansible_timeout to 10 8238 1726882393.79140: Set connection var ansible_shell_executable to /bin/sh 8238 1726882393.79142: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.79147: variable 'ansible_connection' from source: unknown 8238 1726882393.79151: variable 'ansible_module_compression' from source: unknown 8238 1726882393.79153: variable 'ansible_shell_type' from source: unknown 8238 1726882393.79156: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.79158: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.79160: variable 'ansible_pipelining' from source: unknown 8238 1726882393.79162: variable 'ansible_timeout' from source: unknown 8238 1726882393.79164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.79296: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882393.79313: variable 'omit' from source: magic vars 8238 1726882393.79326: starting attempt loop 8238 1726882393.79333: running the handler 8238 1726882393.79354: handler run complete 8238 1726882393.79528: attempt loop complete, returning result 8238 1726882393.79532: _execute() done 8238 1726882393.79535: dumping result to json 8238 1726882393.79537: done dumping result, returning 8238 1726882393.79540: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affc7ec-ae25-54bc-d334-0000000003b0] 8238 1726882393.79542: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b0 8238 1726882393.79621: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b0 8238 1726882393.79627: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8238 1726882393.79692: no more pending results, returning what we have 8238 1726882393.79695: results queue empty 8238 1726882393.79696: checking for any_errors_fatal 8238 1726882393.79697: done checking for any_errors_fatal 8238 1726882393.79698: checking for max_fail_percentage 8238 1726882393.79699: done checking for max_fail_percentage 8238 1726882393.79700: checking to see if all hosts have failed and the running result is not ok 8238 1726882393.79701: done checking to see if all hosts have failed 8238 1726882393.79702: getting the remaining hosts for this loop 8238 1726882393.79703: done getting the remaining hosts for this loop 8238 1726882393.79707: getting the next task for host managed_node3 8238 1726882393.79714: done getting next task for host managed_node3 8238 1726882393.79717: ^ task is: TASK: Stat profile file 8238 1726882393.79720: ^ state is: HOST 
STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882393.79726: getting variables 8238 1726882393.79728: in VariableManager get_vars() 8238 1726882393.79762: Calling all_inventory to load vars for managed_node3 8238 1726882393.79765: Calling groups_inventory to load vars for managed_node3 8238 1726882393.79769: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882393.79778: Calling all_plugins_play to load vars for managed_node3 8238 1726882393.79782: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882393.79784: Calling groups_plugins_play to load vars for managed_node3 8238 1726882393.81114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882393.82268: done with get_vars() 8238 1726882393.82285: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:13 -0400 (0:00:00.046) 0:00:23.978 ****** 8238 1726882393.82377: entering _queue_task() for managed_node3/stat 8238 1726882393.82711: worker is 1 (out of 1 available) 8238 1726882393.82727: exiting _queue_task() for managed_node3/stat 8238 1726882393.82740: done queuing things up, now waiting for results queue to drain 8238 1726882393.82742: waiting for pending results... 
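The two tasks that get_profile_stat.yml runs at this point can be read straight out of the log: a set_fact that initializes the three lsr_net_profile_* flags to false (the "ok" result above), followed by a stat of the initscripts profile file for the current item (its module arguments appear in the stat invocation below). A minimal sketch of what these tasks plausibly look like, reconstructed from this run rather than copied from the collection file, is:

  # Sketch reconstructed from the logged results; the actual tasks in
  # tests/network/playbooks/tasks/get_profile_stat.yml may differ.
  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false

  - name: Stat profile file
    stat:
      # 'profile' comes from the include parameters; for this iteration it is
      # bond0, giving the path seen in the module invocation below.
      path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: profile_stat   # referenced later as profile_stat.stat.exists
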
8238 1726882393.83144: running TaskExecutor() for managed_node3/TASK: Stat profile file 8238 1726882393.83167: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b1 8238 1726882393.83190: variable 'ansible_search_path' from source: unknown 8238 1726882393.83198: variable 'ansible_search_path' from source: unknown 8238 1726882393.83246: calling self._execute() 8238 1726882393.83339: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.83358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.83374: variable 'omit' from source: magic vars 8238 1726882393.83793: variable 'ansible_distribution_major_version' from source: facts 8238 1726882393.83811: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882393.83821: variable 'omit' from source: magic vars 8238 1726882393.83881: variable 'omit' from source: magic vars 8238 1726882393.83994: variable 'profile' from source: include params 8238 1726882393.84011: variable 'item' from source: include params 8238 1726882393.84088: variable 'item' from source: include params 8238 1726882393.84121: variable 'omit' from source: magic vars 8238 1726882393.84178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882393.84329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882393.84332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882393.84335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882393.84338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882393.84340: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882393.84343: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.84345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.84465: Set connection var ansible_connection to ssh 8238 1726882393.84473: Set connection var ansible_shell_type to sh 8238 1726882393.84483: Set connection var ansible_pipelining to False 8238 1726882393.84492: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882393.84501: Set connection var ansible_timeout to 10 8238 1726882393.84513: Set connection var ansible_shell_executable to /bin/sh 8238 1726882393.84544: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.84559: variable 'ansible_connection' from source: unknown 8238 1726882393.84567: variable 'ansible_module_compression' from source: unknown 8238 1726882393.84574: variable 'ansible_shell_type' from source: unknown 8238 1726882393.84580: variable 'ansible_shell_executable' from source: unknown 8238 1726882393.84587: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882393.84594: variable 'ansible_pipelining' from source: unknown 8238 1726882393.84601: variable 'ansible_timeout' from source: unknown 8238 1726882393.84609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882393.84850: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882393.84870: variable 'omit' from source: magic vars 8238 1726882393.84930: starting attempt loop 8238 1726882393.84933: running the handler 8238 1726882393.84936: _low_level_execute_command(): starting 8238 1726882393.84938: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882393.85734: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882393.85823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.85864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882393.85883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.85910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.86045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.87840: stdout chunk (state=3): >>>/root <<< 8238 1726882393.88058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.88062: stdout chunk (state=3): >>><<< 8238 1726882393.88065: stderr chunk (state=3): >>><<< 8238 1726882393.88088: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882393.88112: _low_level_execute_command(): starting 8238 1726882393.88224: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000 `" && echo ansible-tmp-1726882393.8809707-9148-81318788281000="` echo /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000 `" ) && sleep 0' 8238 1726882393.88838: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.88902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882393.88931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.88948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.89059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.91100: stdout chunk (state=3): >>>ansible-tmp-1726882393.8809707-9148-81318788281000=/root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000 <<< 8238 1726882393.91439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.91443: stdout chunk (state=3): >>><<< 8238 1726882393.91445: stderr chunk (state=3): >>><<< 8238 1726882393.91448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882393.8809707-9148-81318788281000=/root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882393.91451: variable 'ansible_module_compression' from source: unknown 8238 1726882393.91647: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8238 1726882393.91772: variable 'ansible_facts' from source: unknown 8238 1726882393.92118: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/AnsiballZ_stat.py 8238 1726882393.92445: Sending initial data 8238 1726882393.92450: Sent initial data (150 bytes) 8238 1726882393.93697: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.93891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882393.93903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.94021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.95667: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8238 1726882393.95674: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882393.95748: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882393.95830: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp__w0zlwm /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/AnsiballZ_stat.py <<< 8238 1726882393.95838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/AnsiballZ_stat.py" <<< 8238 1726882393.95912: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp__w0zlwm" to remote "/root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/AnsiballZ_stat.py" <<< 8238 1726882393.95916: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/AnsiballZ_stat.py" <<< 8238 1726882393.96628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.96683: stderr chunk (state=3): >>><<< 8238 1726882393.96686: stdout chunk (state=3): >>><<< 8238 1726882393.96706: done transferring module to remote 8238 1726882393.96719: _low_level_execute_command(): starting 8238 1726882393.96728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/ /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/AnsiballZ_stat.py && sleep 0' 8238 1726882393.97176: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882393.97179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.97182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882393.97185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.97228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882393.97231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.97321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882393.99158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882393.99200: stderr chunk (state=3): >>><<< 8238 1726882393.99203: stdout chunk (state=3): >>><<< 8238 1726882393.99214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882393.99226: _low_level_execute_command(): starting 8238 1726882393.99229: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/AnsiballZ_stat.py && sleep 0' 8238 1726882393.99657: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882393.99661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.99674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882393.99728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882393.99741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882393.99831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.17520: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8238 1726882394.18787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882394.18852: stderr chunk (state=3): >>><<< 8238 1726882394.18860: stdout chunk (state=3): >>><<< 8238 1726882394.18878: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882394.18903: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882394.18912: _low_level_execute_command(): starting 8238 1726882394.18917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882393.8809707-9148-81318788281000/ > /dev/null 2>&1 && sleep 0' 8238 1726882394.19388: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882394.19400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.19413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882394.19428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.19489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882394.19492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882394.19494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882394.19573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.21459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882394.21501: stderr chunk (state=3): >>><<< 8238 1726882394.21504: stdout chunk (state=3): >>><<< 8238 1726882394.21516: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882394.21530: handler run complete 8238 1726882394.21544: attempt loop complete, returning result 8238 1726882394.21547: _execute() done 8238 1726882394.21549: dumping result to json 8238 1726882394.21554: done dumping result, returning 8238 1726882394.21562: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affc7ec-ae25-54bc-d334-0000000003b1] 8238 1726882394.21572: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b1 8238 1726882394.21672: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b1 8238 1726882394.21675: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8238 1726882394.21737: no more pending results, returning what we have 8238 1726882394.21741: results queue empty 8238 1726882394.21742: checking for any_errors_fatal 8238 1726882394.21749: done checking for any_errors_fatal 8238 1726882394.21749: checking for max_fail_percentage 8238 1726882394.21751: done checking for max_fail_percentage 8238 1726882394.21752: checking to see if all hosts have failed and the running result is not ok 8238 1726882394.21753: done checking to see if all hosts have failed 8238 1726882394.21754: getting the remaining hosts for this loop 8238 1726882394.21756: done getting the remaining hosts for this loop 8238 
1726882394.21760: getting the next task for host managed_node3 8238 1726882394.21767: done getting next task for host managed_node3 8238 1726882394.21770: ^ task is: TASK: Set NM profile exist flag based on the profile files 8238 1726882394.21774: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882394.21779: getting variables 8238 1726882394.21781: in VariableManager get_vars() 8238 1726882394.21825: Calling all_inventory to load vars for managed_node3 8238 1726882394.21828: Calling groups_inventory to load vars for managed_node3 8238 1726882394.21831: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882394.21843: Calling all_plugins_play to load vars for managed_node3 8238 1726882394.21846: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882394.21849: Calling groups_plugins_play to load vars for managed_node3 8238 1726882394.22815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882394.24040: done with get_vars() 8238 1726882394.24057: done getting variables 8238 1726882394.24102: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:14 -0400 (0:00:00.417) 0:00:24.396 ****** 8238 1726882394.24129: entering _queue_task() for managed_node3/set_fact 8238 1726882394.24357: worker is 1 (out of 1 available) 8238 1726882394.24373: exiting _queue_task() for managed_node3/set_fact 8238 1726882394.24386: done queuing things up, now waiting for results queue to drain 8238 1726882394.24388: waiting for pending results... 
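The task queued here only applies when the ifcfg file was found: its when-condition is profile_stat.stat.exists, which is false in this run, so it is skipped below and the play falls through to the shell-based "Get NM profile info" check. From the task name, the logged false_condition, and the flag initialized earlier, it plausibly reduces to a one-line set_fact along these lines (an assumption about the body, not a verbatim copy of the collection file):

  - name: Set NM profile exist flag based on the profile files
    set_fact:
      lsr_net_profile_exists: true   # assumed value; flips the flag initialized to false earlier
    when: profile_stat.stat.exists
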
8238 1726882394.24562: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 8238 1726882394.24643: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b2 8238 1726882394.24730: variable 'ansible_search_path' from source: unknown 8238 1726882394.24740: variable 'ansible_search_path' from source: unknown 8238 1726882394.24771: calling self._execute() 8238 1726882394.24925: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.24940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.24961: variable 'omit' from source: magic vars 8238 1726882394.25265: variable 'ansible_distribution_major_version' from source: facts 8238 1726882394.25275: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882394.25369: variable 'profile_stat' from source: set_fact 8238 1726882394.25380: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882394.25383: when evaluation is False, skipping this task 8238 1726882394.25386: _execute() done 8238 1726882394.25389: dumping result to json 8238 1726882394.25394: done dumping result, returning 8238 1726882394.25399: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affc7ec-ae25-54bc-d334-0000000003b2] 8238 1726882394.25405: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b2 8238 1726882394.25495: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b2 8238 1726882394.25498: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882394.25571: no more pending results, returning what we have 8238 1726882394.25574: results queue empty 8238 1726882394.25575: checking for any_errors_fatal 8238 1726882394.25583: done checking for any_errors_fatal 8238 1726882394.25583: checking for max_fail_percentage 8238 1726882394.25585: done checking for max_fail_percentage 8238 1726882394.25585: checking to see if all hosts have failed and the running result is not ok 8238 1726882394.25586: done checking to see if all hosts have failed 8238 1726882394.25587: getting the remaining hosts for this loop 8238 1726882394.25588: done getting the remaining hosts for this loop 8238 1726882394.25591: getting the next task for host managed_node3 8238 1726882394.25597: done getting next task for host managed_node3 8238 1726882394.25599: ^ task is: TASK: Get NM profile info 8238 1726882394.25603: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882394.25607: getting variables 8238 1726882394.25608: in VariableManager get_vars() 8238 1726882394.25644: Calling all_inventory to load vars for managed_node3 8238 1726882394.25647: Calling groups_inventory to load vars for managed_node3 8238 1726882394.25649: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882394.25659: Calling all_plugins_play to load vars for managed_node3 8238 1726882394.25662: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882394.25665: Calling groups_plugins_play to load vars for managed_node3 8238 1726882394.26547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882394.28464: done with get_vars() 8238 1726882394.28487: done getting variables 8238 1726882394.28549: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:14 -0400 (0:00:00.044) 0:00:24.440 ****** 8238 1726882394.28580: entering _queue_task() for managed_node3/shell 8238 1726882394.28844: worker is 1 (out of 1 available) 8238 1726882394.28858: exiting _queue_task() for managed_node3/shell 8238 1726882394.28870: done queuing things up, now waiting for results queue to drain 8238 1726882394.28872: waiting for pending results... 8238 1726882394.29341: running TaskExecutor() for managed_node3/TASK: Get NM profile info 8238 1726882394.29345: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b3 8238 1726882394.29348: variable 'ansible_search_path' from source: unknown 8238 1726882394.29351: variable 'ansible_search_path' from source: unknown 8238 1726882394.29355: calling self._execute() 8238 1726882394.29434: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.29446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.29461: variable 'omit' from source: magic vars 8238 1726882394.29862: variable 'ansible_distribution_major_version' from source: facts 8238 1726882394.29880: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882394.29907: variable 'omit' from source: magic vars 8238 1726882394.29963: variable 'omit' from source: magic vars 8238 1726882394.30078: variable 'profile' from source: include params 8238 1726882394.30088: variable 'item' from source: include params 8238 1726882394.30163: variable 'item' from source: include params 8238 1726882394.30186: variable 'omit' from source: magic vars 8238 1726882394.30241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882394.30284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882394.30309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882394.30337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882394.30357: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882394.30394: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882394.30453: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.30456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.30537: Set connection var ansible_connection to ssh 8238 1726882394.30546: Set connection var ansible_shell_type to sh 8238 1726882394.30564: Set connection var ansible_pipelining to False 8238 1726882394.30576: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882394.30588: Set connection var ansible_timeout to 10 8238 1726882394.30602: Set connection var ansible_shell_executable to /bin/sh 8238 1726882394.30633: variable 'ansible_shell_executable' from source: unknown 8238 1726882394.30670: variable 'ansible_connection' from source: unknown 8238 1726882394.30673: variable 'ansible_module_compression' from source: unknown 8238 1726882394.30675: variable 'ansible_shell_type' from source: unknown 8238 1726882394.30678: variable 'ansible_shell_executable' from source: unknown 8238 1726882394.30680: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.30682: variable 'ansible_pipelining' from source: unknown 8238 1726882394.30684: variable 'ansible_timeout' from source: unknown 8238 1726882394.30686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.30838: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882394.30887: variable 'omit' from source: magic vars 8238 1726882394.30890: starting attempt loop 8238 1726882394.30893: running the handler 8238 1726882394.30895: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882394.30913: _low_level_execute_command(): starting 8238 1726882394.30952: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882394.31683: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882394.31687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.31690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882394.31693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match 
found <<< 8238 1726882394.31696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.31761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882394.31846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882394.31973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.33634: stdout chunk (state=3): >>>/root <<< 8238 1726882394.33811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882394.33815: stdout chunk (state=3): >>><<< 8238 1726882394.33817: stderr chunk (state=3): >>><<< 8238 1726882394.33935: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882394.33939: _low_level_execute_command(): starting 8238 1726882394.33942: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832 `" && echo ansible-tmp-1726882394.338458-9171-129556255543832="` echo /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832 `" ) && sleep 0' 8238 1726882394.34886: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.35018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882394.35042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882394.35234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.37831: stdout chunk (state=3): >>>ansible-tmp-1726882394.338458-9171-129556255543832=/root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832 <<< 8238 1726882394.37835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882394.37837: stdout chunk (state=3): >>><<< 8238 1726882394.37839: stderr chunk (state=3): >>><<< 8238 1726882394.37842: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882394.338458-9171-129556255543832=/root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882394.37844: variable 'ansible_module_compression' from source: unknown 8238 1726882394.37846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882394.37848: variable 'ansible_facts' from source: unknown 8238 1726882394.38044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/AnsiballZ_command.py 8238 1726882394.38350: Sending initial data 8238 1726882394.38353: Sent initial data (153 bytes) 8238 1726882394.39480: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882394.39492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.39757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882394.39761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882394.39859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.41455: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8238 1726882394.41549: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882394.41676: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpjiopiki_ /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/AnsiballZ_command.py <<< 8238 1726882394.41679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/AnsiballZ_command.py" <<< 8238 1726882394.41725: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpjiopiki_" to remote "/root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/AnsiballZ_command.py" <<< 8238 1726882394.43579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882394.43820: stderr chunk (state=3): >>><<< 8238 1726882394.43827: stdout chunk (state=3): >>><<< 8238 1726882394.43830: done transferring module to remote 8238 1726882394.43832: _low_level_execute_command(): starting 8238 1726882394.43834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/ /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/AnsiballZ_command.py && sleep 0' 8238 1726882394.44970: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882394.44983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.44995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882394.45011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.45057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882394.45094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882394.45217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.47229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882394.47232: stdout chunk (state=3): >>><<< 8238 1726882394.47235: stderr chunk (state=3): >>><<< 8238 1726882394.47237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882394.47240: _low_level_execute_command(): starting 8238 1726882394.47242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/AnsiballZ_command.py && sleep 0' 8238 1726882394.48373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882394.48376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882394.48379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.48381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882394.48383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882394.48446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882394.48454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882394.48457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882394.48560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.67368: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:14.649313", "end": "2024-09-20 21:33:14.671795", "delta": "0:00:00.022482", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882394.69139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882394.69143: stdout chunk (state=3): >>><<< 8238 1726882394.69146: stderr chunk (state=3): >>><<< 8238 1726882394.69148: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:14.649313", "end": "2024-09-20 21:33:14.671795", "delta": "0:00:00.022482", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.45.226 closed. 8238 1726882394.69163: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882394.69177: _low_level_execute_command(): starting 8238 1726882394.69186: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882394.338458-9171-129556255543832/ > /dev/null 2>&1 && sleep 0' 8238 1726882394.69855: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882394.69872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882394.69888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882394.69946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882394.69967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882394.70062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882394.70075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882394.70101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882394.70204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882394.72198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882394.72211: stdout chunk (state=3): >>><<< 8238 1726882394.72226: stderr chunk (state=3): >>><<< 8238 1726882394.72250: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882394.72264: handler run complete 8238 1726882394.72428: Evaluated conditional (False): False 8238 1726882394.72431: attempt loop complete, returning result 8238 1726882394.72434: _execute() done 8238 1726882394.72436: dumping result to json 8238 1726882394.72439: done dumping result, returning 8238 1726882394.72441: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affc7ec-ae25-54bc-d334-0000000003b3] 8238 1726882394.72443: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b3 8238 1726882394.72519: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b3 8238 1726882394.72525: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.022482", "end": "2024-09-20 21:33:14.671795", "rc": 0, "start": "2024-09-20 21:33:14.649313" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 8238 1726882394.72609: no more pending results, returning what we have 8238 1726882394.72613: results queue empty 8238 1726882394.72614: checking for any_errors_fatal 8238 1726882394.72620: done checking for any_errors_fatal 8238 1726882394.72621: checking for max_fail_percentage 8238 1726882394.72632: done checking for max_fail_percentage 8238 1726882394.72633: checking to see if all hosts have failed and the running result is not ok 8238 1726882394.72634: done checking to see if all hosts have failed 8238 1726882394.72635: getting the remaining hosts for this loop 8238 1726882394.72637: done getting the remaining hosts for this loop 8238 1726882394.72641: getting the next task for host managed_node3 8238 1726882394.72650: done getting next task for host managed_node3 8238 1726882394.72653: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8238 1726882394.72659: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8238 1726882394.72663: getting variables 8238 1726882394.72665: in VariableManager get_vars() 8238 1726882394.72710: Calling all_inventory to load vars for managed_node3 8238 1726882394.72714: Calling groups_inventory to load vars for managed_node3 8238 1726882394.72717: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882394.72837: Calling all_plugins_play to load vars for managed_node3 8238 1726882394.72841: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882394.72853: Calling groups_plugins_play to load vars for managed_node3 8238 1726882394.74979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882394.77288: done with get_vars() 8238 1726882394.77313: done getting variables 8238 1726882394.77382: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:14 -0400 (0:00:00.488) 0:00:24.929 ****** 8238 1726882394.77414: entering _queue_task() for managed_node3/set_fact 8238 1726882394.77755: worker is 1 (out of 1 available) 8238 1726882394.77768: exiting _queue_task() for managed_node3/set_fact 8238 1726882394.77782: done queuing things up, now waiting for results queue to drain 8238 1726882394.77783: waiting for pending results... 
8238 1726882394.78245: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8238 1726882394.78251: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b4 8238 1726882394.78254: variable 'ansible_search_path' from source: unknown 8238 1726882394.78256: variable 'ansible_search_path' from source: unknown 8238 1726882394.78283: calling self._execute() 8238 1726882394.78390: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.78402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.78450: variable 'omit' from source: magic vars 8238 1726882394.78833: variable 'ansible_distribution_major_version' from source: facts 8238 1726882394.78850: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882394.79002: variable 'nm_profile_exists' from source: set_fact 8238 1726882394.79021: Evaluated conditional (nm_profile_exists.rc == 0): True 8238 1726882394.79104: variable 'omit' from source: magic vars 8238 1726882394.79108: variable 'omit' from source: magic vars 8238 1726882394.79138: variable 'omit' from source: magic vars 8238 1726882394.79184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882394.79322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882394.79327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882394.79330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882394.79332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882394.79335: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882394.79337: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.79339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.79456: Set connection var ansible_connection to ssh 8238 1726882394.79527: Set connection var ansible_shell_type to sh 8238 1726882394.79530: Set connection var ansible_pipelining to False 8238 1726882394.79536: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882394.79539: Set connection var ansible_timeout to 10 8238 1726882394.79541: Set connection var ansible_shell_executable to /bin/sh 8238 1726882394.79544: variable 'ansible_shell_executable' from source: unknown 8238 1726882394.79546: variable 'ansible_connection' from source: unknown 8238 1726882394.79548: variable 'ansible_module_compression' from source: unknown 8238 1726882394.79550: variable 'ansible_shell_type' from source: unknown 8238 1726882394.79551: variable 'ansible_shell_executable' from source: unknown 8238 1726882394.79553: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.79560: variable 'ansible_pipelining' from source: unknown 8238 1726882394.79568: variable 'ansible_timeout' from source: unknown 8238 1726882394.79576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.79731: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882394.79747: variable 'omit' from source: magic vars 8238 1726882394.79762: starting attempt loop 8238 1726882394.79769: running the handler 8238 1726882394.79787: handler run complete 8238 1726882394.79865: attempt loop complete, returning result 8238 1726882394.79869: _execute() done 8238 1726882394.79872: dumping result to json 8238 1726882394.79874: done dumping result, returning 8238 1726882394.79876: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affc7ec-ae25-54bc-d334-0000000003b4] 8238 1726882394.79878: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b4 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8238 1726882394.80059: no more pending results, returning what we have 8238 1726882394.80063: results queue empty 8238 1726882394.80065: checking for any_errors_fatal 8238 1726882394.80072: done checking for any_errors_fatal 8238 1726882394.80073: checking for max_fail_percentage 8238 1726882394.80074: done checking for max_fail_percentage 8238 1726882394.80081: checking to see if all hosts have failed and the running result is not ok 8238 1726882394.80082: done checking to see if all hosts have failed 8238 1726882394.80084: getting the remaining hosts for this loop 8238 1726882394.80085: done getting the remaining hosts for this loop 8238 1726882394.80089: getting the next task for host managed_node3 8238 1726882394.80099: done getting next task for host managed_node3 8238 1726882394.80102: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8238 1726882394.80108: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882394.80112: getting variables 8238 1726882394.80114: in VariableManager get_vars() 8238 1726882394.80158: Calling all_inventory to load vars for managed_node3 8238 1726882394.80161: Calling groups_inventory to load vars for managed_node3 8238 1726882394.80164: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882394.80336: Calling all_plugins_play to load vars for managed_node3 8238 1726882394.80340: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882394.80343: Calling groups_plugins_play to load vars for managed_node3 8238 1726882394.80948: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b4 8238 1726882394.80952: WORKER PROCESS EXITING 8238 1726882394.82746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882394.86332: done with get_vars() 8238 1726882394.86358: done getting variables 8238 1726882394.86629: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882394.86956: variable 'profile' from source: include params 8238 1726882394.86960: variable 'item' from source: include params 8238 1726882394.87016: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:14 -0400 (0:00:00.096) 0:00:25.025 ****** 8238 1726882394.87056: entering _queue_task() for managed_node3/command 8238 1726882394.87581: worker is 1 (out of 1 available) 8238 1726882394.87595: exiting _queue_task() for managed_node3/command 8238 1726882394.87609: done queuing things up, now waiting for results queue to drain 8238 1726882394.87610: waiting for pending results... 
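The set_fact task reported above ("Set NM profile exist flag and ansible_managed flag true based on the nmcli output") runs only when the registered nmcli result succeeded and sets three flags. A minimal sketch consistent with the evaluated conditional (nm_profile_exists.rc == 0) and the ansible_facts shown in its result:

```yaml
# Reconstructed from the conditional and the ansible_facts in the task result;
# a sketch, not the literal file contents.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
```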
8238 1726882394.88097: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 8238 1726882394.88204: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b6 8238 1726882394.88218: variable 'ansible_search_path' from source: unknown 8238 1726882394.88224: variable 'ansible_search_path' from source: unknown 8238 1726882394.88465: calling self._execute() 8238 1726882394.88554: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.88564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.88573: variable 'omit' from source: magic vars 8238 1726882394.89345: variable 'ansible_distribution_major_version' from source: facts 8238 1726882394.89358: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882394.89689: variable 'profile_stat' from source: set_fact 8238 1726882394.89701: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882394.89705: when evaluation is False, skipping this task 8238 1726882394.89708: _execute() done 8238 1726882394.89713: dumping result to json 8238 1726882394.89716: done dumping result, returning 8238 1726882394.89725: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affc7ec-ae25-54bc-d334-0000000003b6] 8238 1726882394.89732: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b6 8238 1726882394.89977: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b6 8238 1726882394.89981: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882394.90036: no more pending results, returning what we have 8238 1726882394.90040: results queue empty 8238 1726882394.90041: checking for any_errors_fatal 8238 1726882394.90048: done checking for any_errors_fatal 8238 1726882394.90049: checking for max_fail_percentage 8238 1726882394.90050: done checking for max_fail_percentage 8238 1726882394.90051: checking to see if all hosts have failed and the running result is not ok 8238 1726882394.90052: done checking to see if all hosts have failed 8238 1726882394.90053: getting the remaining hosts for this loop 8238 1726882394.90055: done getting the remaining hosts for this loop 8238 1726882394.90059: getting the next task for host managed_node3 8238 1726882394.90066: done getting next task for host managed_node3 8238 1726882394.90069: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8238 1726882394.90074: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882394.90078: getting variables 8238 1726882394.90079: in VariableManager get_vars() 8238 1726882394.90119: Calling all_inventory to load vars for managed_node3 8238 1726882394.90124: Calling groups_inventory to load vars for managed_node3 8238 1726882394.90126: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882394.90139: Calling all_plugins_play to load vars for managed_node3 8238 1726882394.90141: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882394.90144: Calling groups_plugins_play to load vars for managed_node3 8238 1726882394.92982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882394.95317: done with get_vars() 8238 1726882394.95625: done getting variables 8238 1726882394.95691: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882394.95812: variable 'profile' from source: include params 8238 1726882394.95817: variable 'item' from source: include params 8238 1726882394.96079: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:14 -0400 (0:00:00.090) 0:00:25.116 ****** 8238 1726882394.96111: entering _queue_task() for managed_node3/set_fact 8238 1726882394.97060: worker is 1 (out of 1 available) 8238 1726882394.97072: exiting _queue_task() for managed_node3/set_fact 8238 1726882394.97083: done queuing things up, now waiting for results queue to drain 8238 1726882394.97085: waiting for pending results... 
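The "Get the ansible_managed comment in ifcfg-bond0" task above skipped with false_condition: profile_stat.stat.exists, and the companion verify/fingerprint tasks that follow skip the same way. A guarded task of that shape could look like the sketch below; the task name template and the when: guard come from the log, while the grep command, the ifcfg path, and the register name are placeholders for illustration only.

```yaml
# Hypothetical shape of the skipped task; only the name and the when: guard
# are confirmed by the log, the command and register name are placeholders.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: "grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists
```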
8238 1726882394.97436: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 8238 1726882394.97750: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b7 8238 1726882394.97755: variable 'ansible_search_path' from source: unknown 8238 1726882394.97758: variable 'ansible_search_path' from source: unknown 8238 1726882394.97762: calling self._execute() 8238 1726882394.97907: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882394.97978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882394.97998: variable 'omit' from source: magic vars 8238 1726882394.98819: variable 'ansible_distribution_major_version' from source: facts 8238 1726882394.99028: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882394.99120: variable 'profile_stat' from source: set_fact 8238 1726882394.99393: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882394.99397: when evaluation is False, skipping this task 8238 1726882394.99399: _execute() done 8238 1726882394.99402: dumping result to json 8238 1726882394.99404: done dumping result, returning 8238 1726882394.99407: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affc7ec-ae25-54bc-d334-0000000003b7] 8238 1726882394.99409: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b7 8238 1726882394.99485: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b7 8238 1726882394.99488: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882394.99550: no more pending results, returning what we have 8238 1726882394.99554: results queue empty 8238 1726882394.99556: checking for any_errors_fatal 8238 1726882394.99563: done checking for any_errors_fatal 8238 1726882394.99564: checking for max_fail_percentage 8238 1726882394.99565: done checking for max_fail_percentage 8238 1726882394.99566: checking to see if all hosts have failed and the running result is not ok 8238 1726882394.99567: done checking to see if all hosts have failed 8238 1726882394.99568: getting the remaining hosts for this loop 8238 1726882394.99570: done getting the remaining hosts for this loop 8238 1726882394.99575: getting the next task for host managed_node3 8238 1726882394.99585: done getting next task for host managed_node3 8238 1726882394.99588: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8238 1726882394.99594: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882394.99600: getting variables 8238 1726882394.99602: in VariableManager get_vars() 8238 1726882394.99654: Calling all_inventory to load vars for managed_node3 8238 1726882394.99658: Calling groups_inventory to load vars for managed_node3 8238 1726882394.99661: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882394.99677: Calling all_plugins_play to load vars for managed_node3 8238 1726882394.99680: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882394.99683: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.03665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.05783: done with get_vars() 8238 1726882395.05810: done getting variables 8238 1726882395.06087: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882395.06402: variable 'profile' from source: include params 8238 1726882395.06407: variable 'item' from source: include params 8238 1726882395.06512: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:15 -0400 (0:00:00.104) 0:00:25.220 ****** 8238 1726882395.06546: entering _queue_task() for managed_node3/command 8238 1726882395.07391: worker is 1 (out of 1 available) 8238 1726882395.07406: exiting _queue_task() for managed_node3/command 8238 1726882395.07418: done queuing things up, now waiting for results queue to drain 8238 1726882395.07420: waiting for pending results... 
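All of the ifcfg checks in this stretch skip because profile_stat.stat.exists is false, which matches the earlier nmcli output: the bond0 profiles live as NetworkManager keyfiles under /etc/NetworkManager/system-connections, so there is no ifcfg-bond0 file to inspect. The log shows profile_stat "from source: set_fact", which is how registered results appear in this debug output (nm_profile_exists shows the same source); a stat task along these lines would produce it, with the exact path an assumption:

```yaml
# Assumed producer of profile_stat; the variable name and its .stat.exists usage
# come from the log, the path below is illustrative only.
- name: Get the stat of the ifcfg file for {{ profile }}
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat
```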
8238 1726882395.08139: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 8238 1726882395.08279: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b8 8238 1726882395.08301: variable 'ansible_search_path' from source: unknown 8238 1726882395.08305: variable 'ansible_search_path' from source: unknown 8238 1726882395.08335: calling self._execute() 8238 1726882395.08557: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.08561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.08564: variable 'omit' from source: magic vars 8238 1726882395.08874: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.08914: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.09027: variable 'profile_stat' from source: set_fact 8238 1726882395.09039: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882395.09042: when evaluation is False, skipping this task 8238 1726882395.09045: _execute() done 8238 1726882395.09047: dumping result to json 8238 1726882395.09052: done dumping result, returning 8238 1726882395.09061: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0affc7ec-ae25-54bc-d334-0000000003b8] 8238 1726882395.09068: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b8 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882395.09219: no more pending results, returning what we have 8238 1726882395.09226: results queue empty 8238 1726882395.09227: checking for any_errors_fatal 8238 1726882395.09233: done checking for any_errors_fatal 8238 1726882395.09234: checking for max_fail_percentage 8238 1726882395.09236: done checking for max_fail_percentage 8238 1726882395.09237: checking to see if all hosts have failed and the running result is not ok 8238 1726882395.09238: done checking to see if all hosts have failed 8238 1726882395.09239: getting the remaining hosts for this loop 8238 1726882395.09240: done getting the remaining hosts for this loop 8238 1726882395.09244: getting the next task for host managed_node3 8238 1726882395.09255: done getting next task for host managed_node3 8238 1726882395.09258: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8238 1726882395.09264: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882395.09268: getting variables 8238 1726882395.09270: in VariableManager get_vars() 8238 1726882395.09314: Calling all_inventory to load vars for managed_node3 8238 1726882395.09318: Calling groups_inventory to load vars for managed_node3 8238 1726882395.09320: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.09339: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.09342: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.09346: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.09874: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b8 8238 1726882395.09877: WORKER PROCESS EXITING 8238 1726882395.12597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.17303: done with get_vars() 8238 1726882395.17335: done getting variables 8238 1726882395.17405: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882395.17735: variable 'profile' from source: include params 8238 1726882395.17739: variable 'item' from source: include params 8238 1726882395.17806: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:15 -0400 (0:00:00.114) 0:00:25.335 ****** 8238 1726882395.18045: entering _queue_task() for managed_node3/set_fact 8238 1726882395.18829: worker is 1 (out of 1 available) 8238 1726882395.18841: exiting _queue_task() for managed_node3/set_fact 8238 1726882395.18855: done queuing things up, now waiting for results queue to drain 8238 1726882395.18857: waiting for pending results... 
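The repeated "variable 'profile' from source: include params" and "variable 'item' from source: include params" entries indicate these task files are pulled in with include-time vars, one iteration per bond profile. A wrapper of roughly this shape would explain what the log shows; the wrapper name and structure are assumptions, while the profile values are taken from the nmcli output above.

```yaml
# Hypothetical wrapper; "include params" in the log implies vars passed at include time.
- name: Check each bond profile
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - bond0
    - bond0.0
    - bond0.1
```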
8238 1726882395.19137: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 8238 1726882395.19395: in run() - task 0affc7ec-ae25-54bc-d334-0000000003b9 8238 1726882395.19439: variable 'ansible_search_path' from source: unknown 8238 1726882395.19635: variable 'ansible_search_path' from source: unknown 8238 1726882395.19639: calling self._execute() 8238 1726882395.19741: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.19825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.19840: variable 'omit' from source: magic vars 8238 1726882395.20731: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.20736: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.20965: variable 'profile_stat' from source: set_fact 8238 1726882395.20986: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882395.20995: when evaluation is False, skipping this task 8238 1726882395.21003: _execute() done 8238 1726882395.21010: dumping result to json 8238 1726882395.21018: done dumping result, returning 8238 1726882395.21032: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affc7ec-ae25-54bc-d334-0000000003b9] 8238 1726882395.21257: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b9 8238 1726882395.21333: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003b9 8238 1726882395.21337: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882395.21392: no more pending results, returning what we have 8238 1726882395.21396: results queue empty 8238 1726882395.21397: checking for any_errors_fatal 8238 1726882395.21403: done checking for any_errors_fatal 8238 1726882395.21404: checking for max_fail_percentage 8238 1726882395.21405: done checking for max_fail_percentage 8238 1726882395.21406: checking to see if all hosts have failed and the running result is not ok 8238 1726882395.21407: done checking to see if all hosts have failed 8238 1726882395.21408: getting the remaining hosts for this loop 8238 1726882395.21409: done getting the remaining hosts for this loop 8238 1726882395.21413: getting the next task for host managed_node3 8238 1726882395.21425: done getting next task for host managed_node3 8238 1726882395.21428: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8238 1726882395.21432: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882395.21438: getting variables 8238 1726882395.21440: in VariableManager get_vars() 8238 1726882395.21489: Calling all_inventory to load vars for managed_node3 8238 1726882395.21492: Calling groups_inventory to load vars for managed_node3 8238 1726882395.21495: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.21510: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.21513: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.21516: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.25089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.28090: done with get_vars() 8238 1726882395.28128: done getting variables 8238 1726882395.28198: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882395.28329: variable 'profile' from source: include params 8238 1726882395.28338: variable 'item' from source: include params 8238 1726882395.28406: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:15 -0400 (0:00:00.103) 0:00:25.439 ****** 8238 1726882395.28441: entering _queue_task() for managed_node3/assert 8238 1726882395.28941: worker is 1 (out of 1 available) 8238 1726882395.28958: exiting _queue_task() for managed_node3/assert 8238 1726882395.28971: done queuing things up, now waiting for results queue to drain 8238 1726882395.28973: waiting for pending results... 
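The recurring "Set connection var ..." lines that bracket each task (including the block that follows for the assert task) show the effective per-host connection settings: ssh connection, sh shell with /bin/sh, a 10-second timeout, pipelining off, ZIP_DEFLATED module compression. If you wanted to pin the same behaviour explicitly, host_vars of roughly this form would do it; the values mirror the log, but expressing them this way is an assumption, since here they largely come from defaults.

```yaml
# host_vars/managed_node3.yml -- illustrative only; values copied from the
# "Set connection var" lines in the log, not from a file in the test run.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED
```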
8238 1726882395.29353: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 8238 1726882395.29359: in run() - task 0affc7ec-ae25-54bc-d334-000000000260 8238 1726882395.29362: variable 'ansible_search_path' from source: unknown 8238 1726882395.29366: variable 'ansible_search_path' from source: unknown 8238 1726882395.29404: calling self._execute() 8238 1726882395.29630: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.29633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.29637: variable 'omit' from source: magic vars 8238 1726882395.30028: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.30040: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.30047: variable 'omit' from source: magic vars 8238 1726882395.30103: variable 'omit' from source: magic vars 8238 1726882395.30244: variable 'profile' from source: include params 8238 1726882395.30248: variable 'item' from source: include params 8238 1726882395.30320: variable 'item' from source: include params 8238 1726882395.30343: variable 'omit' from source: magic vars 8238 1726882395.30391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882395.30436: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882395.30462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882395.30483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.30494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.30537: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882395.30541: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.30543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.30678: Set connection var ansible_connection to ssh 8238 1726882395.30682: Set connection var ansible_shell_type to sh 8238 1726882395.30684: Set connection var ansible_pipelining to False 8238 1726882395.30687: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882395.30691: Set connection var ansible_timeout to 10 8238 1726882395.30700: Set connection var ansible_shell_executable to /bin/sh 8238 1726882395.30726: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.30865: variable 'ansible_connection' from source: unknown 8238 1726882395.30869: variable 'ansible_module_compression' from source: unknown 8238 1726882395.30872: variable 'ansible_shell_type' from source: unknown 8238 1726882395.30874: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.30879: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.30929: variable 'ansible_pipelining' from source: unknown 8238 1726882395.30932: variable 'ansible_timeout' from source: unknown 8238 1726882395.30935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.31243: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882395.31258: variable 'omit' from source: magic vars 8238 1726882395.31263: starting attempt loop 8238 1726882395.31266: running the handler 8238 1726882395.31498: variable 'lsr_net_profile_exists' from source: set_fact 8238 1726882395.31513: Evaluated conditional (lsr_net_profile_exists): True 8238 1726882395.31517: handler run complete 8238 1726882395.31663: attempt loop complete, returning result 8238 1726882395.31667: _execute() done 8238 1726882395.31669: dumping result to json 8238 1726882395.31672: done dumping result, returning 8238 1726882395.31675: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [0affc7ec-ae25-54bc-d334-000000000260] 8238 1726882395.31677: sending task result for task 0affc7ec-ae25-54bc-d334-000000000260 8238 1726882395.31799: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000260 8238 1726882395.31802: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882395.31862: no more pending results, returning what we have 8238 1726882395.31866: results queue empty 8238 1726882395.31867: checking for any_errors_fatal 8238 1726882395.31874: done checking for any_errors_fatal 8238 1726882395.31875: checking for max_fail_percentage 8238 1726882395.31877: done checking for max_fail_percentage 8238 1726882395.31879: checking to see if all hosts have failed and the running result is not ok 8238 1726882395.31880: done checking to see if all hosts have failed 8238 1726882395.31881: getting the remaining hosts for this loop 8238 1726882395.31883: done getting the remaining hosts for this loop 8238 1726882395.31887: getting the next task for host managed_node3 8238 1726882395.31894: done getting next task for host managed_node3 8238 1726882395.31897: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8238 1726882395.31901: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882395.31907: getting variables 8238 1726882395.31909: in VariableManager get_vars() 8238 1726882395.31959: Calling all_inventory to load vars for managed_node3 8238 1726882395.31962: Calling groups_inventory to load vars for managed_node3 8238 1726882395.31966: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.31978: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.31981: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.31985: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.40056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.42758: done with get_vars() 8238 1726882395.42789: done getting variables 8238 1726882395.42878: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882395.43024: variable 'profile' from source: include params 8238 1726882395.43028: variable 'item' from source: include params 8238 1726882395.43132: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:15 -0400 (0:00:00.147) 0:00:25.586 ****** 8238 1726882395.43167: entering _queue_task() for managed_node3/assert 8238 1726882395.43919: worker is 1 (out of 1 available) 8238 1726882395.44135: exiting _queue_task() for managed_node3/assert 8238 1726882395.44148: done queuing things up, now waiting for results queue to drain 8238 1726882395.44150: waiting for pending results... 
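As with the previous assertion, the task queued here (assert_profile_present.yml:10) is an assert whose conditional, lsr_net_profile_ansible_managed, is evaluated later in the trace. A hedged sketch of the task, reconstructed from the task name and that conditional rather than taken verbatim from the file:

    # Hedged reconstruction; not the verbatim upstream file.
    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed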
8238 1726882395.44556: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 8238 1726882395.44571: in run() - task 0affc7ec-ae25-54bc-d334-000000000261 8238 1726882395.44930: variable 'ansible_search_path' from source: unknown 8238 1726882395.44934: variable 'ansible_search_path' from source: unknown 8238 1726882395.44938: calling self._execute() 8238 1726882395.44941: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.44945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.44949: variable 'omit' from source: magic vars 8238 1726882395.45551: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.45565: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.45572: variable 'omit' from source: magic vars 8238 1726882395.45617: variable 'omit' from source: magic vars 8238 1726882395.45950: variable 'profile' from source: include params 8238 1726882395.45953: variable 'item' from source: include params 8238 1726882395.46112: variable 'item' from source: include params 8238 1726882395.46136: variable 'omit' from source: magic vars 8238 1726882395.46178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882395.46221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882395.46243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882395.46268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.46276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.46332: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882395.46336: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.46339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.46487: Set connection var ansible_connection to ssh 8238 1726882395.46490: Set connection var ansible_shell_type to sh 8238 1726882395.46493: Set connection var ansible_pipelining to False 8238 1726882395.46495: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882395.46497: Set connection var ansible_timeout to 10 8238 1726882395.46500: Set connection var ansible_shell_executable to /bin/sh 8238 1726882395.46512: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.46515: variable 'ansible_connection' from source: unknown 8238 1726882395.46517: variable 'ansible_module_compression' from source: unknown 8238 1726882395.46520: variable 'ansible_shell_type' from source: unknown 8238 1726882395.46528: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.46530: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.46538: variable 'ansible_pipelining' from source: unknown 8238 1726882395.46542: variable 'ansible_timeout' from source: unknown 8238 1726882395.46561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.46813: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882395.46817: variable 'omit' from source: magic vars 8238 1726882395.46820: starting attempt loop 8238 1726882395.46823: running the handler 8238 1726882395.46835: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8238 1726882395.46840: Evaluated conditional (lsr_net_profile_ansible_managed): True 8238 1726882395.46847: handler run complete 8238 1726882395.46868: attempt loop complete, returning result 8238 1726882395.46871: _execute() done 8238 1726882395.46874: dumping result to json 8238 1726882395.46877: done dumping result, returning 8238 1726882395.46884: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0affc7ec-ae25-54bc-d334-000000000261] 8238 1726882395.46889: sending task result for task 0affc7ec-ae25-54bc-d334-000000000261 8238 1726882395.46988: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000261 8238 1726882395.46991: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882395.47044: no more pending results, returning what we have 8238 1726882395.47048: results queue empty 8238 1726882395.47049: checking for any_errors_fatal 8238 1726882395.47058: done checking for any_errors_fatal 8238 1726882395.47059: checking for max_fail_percentage 8238 1726882395.47061: done checking for max_fail_percentage 8238 1726882395.47062: checking to see if all hosts have failed and the running result is not ok 8238 1726882395.47063: done checking to see if all hosts have failed 8238 1726882395.47064: getting the remaining hosts for this loop 8238 1726882395.47065: done getting the remaining hosts for this loop 8238 1726882395.47070: getting the next task for host managed_node3 8238 1726882395.47077: done getting next task for host managed_node3 8238 1726882395.47079: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8238 1726882395.47082: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882395.47087: getting variables 8238 1726882395.47089: in VariableManager get_vars() 8238 1726882395.47133: Calling all_inventory to load vars for managed_node3 8238 1726882395.47136: Calling groups_inventory to load vars for managed_node3 8238 1726882395.47139: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.47151: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.47153: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.47157: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.49443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.51794: done with get_vars() 8238 1726882395.51825: done getting variables 8238 1726882395.51892: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882395.52012: variable 'profile' from source: include params 8238 1726882395.52016: variable 'item' from source: include params 8238 1726882395.52082: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:15 -0400 (0:00:00.089) 0:00:25.676 ****** 8238 1726882395.52118: entering _queue_task() for managed_node3/assert 8238 1726882395.52465: worker is 1 (out of 1 available) 8238 1726882395.52480: exiting _queue_task() for managed_node3/assert 8238 1726882395.52492: done queuing things up, now waiting for results queue to drain 8238 1726882395.52493: waiting for pending results... 
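The third assertion in this block (assert_profile_present.yml:15) follows the same pattern, checking the fingerprint flag set earlier by set_fact. A hedged sketch, again reconstructed only from the task name and the conditional the trace evaluates (lsr_net_profile_fingerprint):

    # Hedged reconstruction; not the verbatim upstream file.
    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint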
8238 1726882395.52828: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 8238 1726882395.52835: in run() - task 0affc7ec-ae25-54bc-d334-000000000262 8238 1726882395.52855: variable 'ansible_search_path' from source: unknown 8238 1726882395.52860: variable 'ansible_search_path' from source: unknown 8238 1726882395.52901: calling self._execute() 8238 1726882395.52999: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.53005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.53015: variable 'omit' from source: magic vars 8238 1726882395.53424: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.53467: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.53470: variable 'omit' from source: magic vars 8238 1726882395.53485: variable 'omit' from source: magic vars 8238 1726882395.53596: variable 'profile' from source: include params 8238 1726882395.53608: variable 'item' from source: include params 8238 1726882395.53685: variable 'item' from source: include params 8238 1726882395.53726: variable 'omit' from source: magic vars 8238 1726882395.53740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882395.53782: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882395.53802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882395.53826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.53903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.53907: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882395.53910: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.53912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.53991: Set connection var ansible_connection to ssh 8238 1726882395.53994: Set connection var ansible_shell_type to sh 8238 1726882395.53999: Set connection var ansible_pipelining to False 8238 1726882395.54010: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882395.54014: Set connection var ansible_timeout to 10 8238 1726882395.54021: Set connection var ansible_shell_executable to /bin/sh 8238 1726882395.54048: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.54052: variable 'ansible_connection' from source: unknown 8238 1726882395.54055: variable 'ansible_module_compression' from source: unknown 8238 1726882395.54060: variable 'ansible_shell_type' from source: unknown 8238 1726882395.54063: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.54065: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.54121: variable 'ansible_pipelining' from source: unknown 8238 1726882395.54127: variable 'ansible_timeout' from source: unknown 8238 1726882395.54130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.54227: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882395.54236: variable 'omit' from source: magic vars 8238 1726882395.54241: starting attempt loop 8238 1726882395.54244: running the handler 8238 1726882395.54376: variable 'lsr_net_profile_fingerprint' from source: set_fact 8238 1726882395.54379: Evaluated conditional (lsr_net_profile_fingerprint): True 8238 1726882395.54427: handler run complete 8238 1726882395.54430: attempt loop complete, returning result 8238 1726882395.54433: _execute() done 8238 1726882395.54435: dumping result to json 8238 1726882395.54438: done dumping result, returning 8238 1726882395.54440: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [0affc7ec-ae25-54bc-d334-000000000262] 8238 1726882395.54448: sending task result for task 0affc7ec-ae25-54bc-d334-000000000262 8238 1726882395.54519: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000262 8238 1726882395.54725: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882395.54769: no more pending results, returning what we have 8238 1726882395.54772: results queue empty 8238 1726882395.54773: checking for any_errors_fatal 8238 1726882395.54778: done checking for any_errors_fatal 8238 1726882395.54779: checking for max_fail_percentage 8238 1726882395.54780: done checking for max_fail_percentage 8238 1726882395.54782: checking to see if all hosts have failed and the running result is not ok 8238 1726882395.54782: done checking to see if all hosts have failed 8238 1726882395.54783: getting the remaining hosts for this loop 8238 1726882395.54785: done getting the remaining hosts for this loop 8238 1726882395.54788: getting the next task for host managed_node3 8238 1726882395.54797: done getting next task for host managed_node3 8238 1726882395.54799: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8238 1726882395.54803: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882395.54807: getting variables 8238 1726882395.54808: in VariableManager get_vars() 8238 1726882395.54844: Calling all_inventory to load vars for managed_node3 8238 1726882395.54847: Calling groups_inventory to load vars for managed_node3 8238 1726882395.54850: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.54861: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.54864: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.54867: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.56578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.58678: done with get_vars() 8238 1726882395.58704: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:15 -0400 (0:00:00.066) 0:00:25.743 ****** 8238 1726882395.58810: entering _queue_task() for managed_node3/include_tasks 8238 1726882395.59135: worker is 1 (out of 1 available) 8238 1726882395.59148: exiting _queue_task() for managed_node3/include_tasks 8238 1726882395.59165: done queuing things up, now waiting for results queue to drain 8238 1726882395.59167: waiting for pending results... 8238 1726882395.59465: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 8238 1726882395.59569: in run() - task 0affc7ec-ae25-54bc-d334-000000000266 8238 1726882395.59582: variable 'ansible_search_path' from source: unknown 8238 1726882395.59585: variable 'ansible_search_path' from source: unknown 8238 1726882395.59630: calling self._execute() 8238 1726882395.59724: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.59728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.59751: variable 'omit' from source: magic vars 8238 1726882395.60184: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.60192: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.60195: _execute() done 8238 1726882395.60199: dumping result to json 8238 1726882395.60201: done dumping result, returning 8238 1726882395.60204: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affc7ec-ae25-54bc-d334-000000000266] 8238 1726882395.60206: sending task result for task 0affc7ec-ae25-54bc-d334-000000000266 8238 1726882395.60280: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000266 8238 1726882395.60283: WORKER PROCESS EXITING 8238 1726882395.60326: no more pending results, returning what we have 8238 1726882395.60333: in VariableManager get_vars() 8238 1726882395.60388: Calling all_inventory to load vars for managed_node3 8238 1726882395.60392: Calling groups_inventory to load vars for managed_node3 8238 1726882395.60394: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.60411: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.60414: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.60417: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.62313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.64402: 
done with get_vars() 8238 1726882395.64429: variable 'ansible_search_path' from source: unknown 8238 1726882395.64431: variable 'ansible_search_path' from source: unknown 8238 1726882395.64475: we have included files to process 8238 1726882395.64476: generating all_blocks data 8238 1726882395.64478: done generating all_blocks data 8238 1726882395.64483: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882395.64484: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882395.64487: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882395.65449: done processing included file 8238 1726882395.65454: iterating over new_blocks loaded from include file 8238 1726882395.65456: in VariableManager get_vars() 8238 1726882395.65476: done with get_vars() 8238 1726882395.65478: filtering new block on tags 8238 1726882395.65506: done filtering new block on tags 8238 1726882395.65509: in VariableManager get_vars() 8238 1726882395.65531: done with get_vars() 8238 1726882395.65533: filtering new block on tags 8238 1726882395.65560: done filtering new block on tags 8238 1726882395.65562: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 8238 1726882395.65567: extending task lists for all hosts with included blocks 8238 1726882395.65765: done extending task lists 8238 1726882395.65767: done processing included files 8238 1726882395.65768: results queue empty 8238 1726882395.65769: checking for any_errors_fatal 8238 1726882395.65772: done checking for any_errors_fatal 8238 1726882395.65773: checking for max_fail_percentage 8238 1726882395.65774: done checking for max_fail_percentage 8238 1726882395.65775: checking to see if all hosts have failed and the running result is not ok 8238 1726882395.65776: done checking to see if all hosts have failed 8238 1726882395.65776: getting the remaining hosts for this loop 8238 1726882395.65778: done getting the remaining hosts for this loop 8238 1726882395.65780: getting the next task for host managed_node3 8238 1726882395.65784: done getting next task for host managed_node3 8238 1726882395.65787: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8238 1726882395.65790: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882395.65793: getting variables 8238 1726882395.65794: in VariableManager get_vars() 8238 1726882395.65807: Calling all_inventory to load vars for managed_node3 8238 1726882395.65809: Calling groups_inventory to load vars for managed_node3 8238 1726882395.65811: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.65817: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.65820: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.65824: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.67362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.69589: done with get_vars() 8238 1726882395.69613: done getting variables 8238 1726882395.69658: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:15 -0400 (0:00:00.108) 0:00:25.851 ****** 8238 1726882395.69687: entering _queue_task() for managed_node3/set_fact 8238 1726882395.70666: worker is 1 (out of 1 available) 8238 1726882395.70679: exiting _queue_task() for managed_node3/set_fact 8238 1726882395.70692: done queuing things up, now waiting for results queue to drain 8238 1726882395.70694: waiting for pending results... 8238 1726882395.71146: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 8238 1726882395.71376: in run() - task 0affc7ec-ae25-54bc-d334-0000000003f8 8238 1726882395.71381: variable 'ansible_search_path' from source: unknown 8238 1726882395.71384: variable 'ansible_search_path' from source: unknown 8238 1726882395.71407: calling self._execute() 8238 1726882395.71512: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.71519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.71534: variable 'omit' from source: magic vars 8238 1726882395.72029: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.72034: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.72038: variable 'omit' from source: magic vars 8238 1726882395.72040: variable 'omit' from source: magic vars 8238 1726882395.72085: variable 'omit' from source: magic vars 8238 1726882395.72130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882395.72176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882395.72196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882395.72216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.72229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.72312: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 8238 1726882395.72316: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.72319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.72399: Set connection var ansible_connection to ssh 8238 1726882395.72403: Set connection var ansible_shell_type to sh 8238 1726882395.72408: Set connection var ansible_pipelining to False 8238 1726882395.72415: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882395.72423: Set connection var ansible_timeout to 10 8238 1726882395.72467: Set connection var ansible_shell_executable to /bin/sh 8238 1726882395.72471: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.72474: variable 'ansible_connection' from source: unknown 8238 1726882395.72477: variable 'ansible_module_compression' from source: unknown 8238 1726882395.72479: variable 'ansible_shell_type' from source: unknown 8238 1726882395.72481: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.72483: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.72485: variable 'ansible_pipelining' from source: unknown 8238 1726882395.72488: variable 'ansible_timeout' from source: unknown 8238 1726882395.72490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.72662: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882395.72667: variable 'omit' from source: magic vars 8238 1726882395.72669: starting attempt loop 8238 1726882395.72672: running the handler 8238 1726882395.72728: handler run complete 8238 1726882395.72732: attempt loop complete, returning result 8238 1726882395.72734: _execute() done 8238 1726882395.72737: dumping result to json 8238 1726882395.72739: done dumping result, returning 8238 1726882395.72742: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affc7ec-ae25-54bc-d334-0000000003f8] 8238 1726882395.72744: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003f8 8238 1726882395.72861: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003f8 8238 1726882395.72866: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8238 1726882395.72933: no more pending results, returning what we have 8238 1726882395.72938: results queue empty 8238 1726882395.72939: checking for any_errors_fatal 8238 1726882395.72941: done checking for any_errors_fatal 8238 1726882395.72942: checking for max_fail_percentage 8238 1726882395.72944: done checking for max_fail_percentage 8238 1726882395.72945: checking to see if all hosts have failed and the running result is not ok 8238 1726882395.72946: done checking to see if all hosts have failed 8238 1726882395.72947: getting the remaining hosts for this loop 8238 1726882395.72949: done getting the remaining hosts for this loop 8238 1726882395.72956: getting the next task for host managed_node3 8238 1726882395.72964: done getting next task for host managed_node3 8238 1726882395.72967: ^ task is: TASK: Stat profile file 8238 1726882395.72973: ^ state is: HOST 
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882395.72978: getting variables 8238 1726882395.72980: in VariableManager get_vars() 8238 1726882395.73249: Calling all_inventory to load vars for managed_node3 8238 1726882395.73255: Calling groups_inventory to load vars for managed_node3 8238 1726882395.73258: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882395.73269: Calling all_plugins_play to load vars for managed_node3 8238 1726882395.73272: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882395.73276: Calling groups_plugins_play to load vars for managed_node3 8238 1726882395.77357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882395.81758: done with get_vars() 8238 1726882395.81793: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:15 -0400 (0:00:00.124) 0:00:25.976 ****** 8238 1726882395.82109: entering _queue_task() for managed_node3/stat 8238 1726882395.82893: worker is 1 (out of 1 available) 8238 1726882395.82907: exiting _queue_task() for managed_node3/stat 8238 1726882395.82920: done queuing things up, now waiting for results queue to drain 8238 1726882395.82924: waiting for pending results... 
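At this point the include has loaded get_profile_stat.yml and its first two tasks run: a set_fact that resets the three lsr_net_profile_* flags to false (the ansible_facts shown in the result above) and a stat of the profile's ifcfg file, whose module arguments appear further down in the trace (path /etc/sysconfig/network-scripts/ifcfg-bond0.0 with get_attributes, get_checksum and get_mime disabled). A hedged sketch of those two tasks; the templated path and the register name profile_stat are assumptions for illustration, not copied from the file:

    # Hedged reconstruction of get_profile_stat.yml:3 and :9.
    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # rendered as ifcfg-bond0.0 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat  # assumed name, reused by the later set_fact sketch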
8238 1726882395.83644: running TaskExecutor() for managed_node3/TASK: Stat profile file 8238 1726882395.83777: in run() - task 0affc7ec-ae25-54bc-d334-0000000003f9 8238 1726882395.83794: variable 'ansible_search_path' from source: unknown 8238 1726882395.83802: variable 'ansible_search_path' from source: unknown 8238 1726882395.83996: calling self._execute() 8238 1726882395.84099: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.84155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.84188: variable 'omit' from source: magic vars 8238 1726882395.85146: variable 'ansible_distribution_major_version' from source: facts 8238 1726882395.85165: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882395.85182: variable 'omit' from source: magic vars 8238 1726882395.85401: variable 'omit' from source: magic vars 8238 1726882395.85494: variable 'profile' from source: include params 8238 1726882395.85727: variable 'item' from source: include params 8238 1726882395.85732: variable 'item' from source: include params 8238 1726882395.85735: variable 'omit' from source: magic vars 8238 1726882395.85872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882395.85920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882395.85969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882395.86051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.86073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882395.86169: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882395.86219: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.86230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.86472: Set connection var ansible_connection to ssh 8238 1726882395.86487: Set connection var ansible_shell_type to sh 8238 1726882395.86502: Set connection var ansible_pipelining to False 8238 1726882395.86526: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882395.86821: Set connection var ansible_timeout to 10 8238 1726882395.86826: Set connection var ansible_shell_executable to /bin/sh 8238 1726882395.86829: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.86831: variable 'ansible_connection' from source: unknown 8238 1726882395.86834: variable 'ansible_module_compression' from source: unknown 8238 1726882395.86836: variable 'ansible_shell_type' from source: unknown 8238 1726882395.86838: variable 'ansible_shell_executable' from source: unknown 8238 1726882395.86840: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882395.86842: variable 'ansible_pipelining' from source: unknown 8238 1726882395.86845: variable 'ansible_timeout' from source: unknown 8238 1726882395.86847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882395.87183: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882395.87236: variable 'omit' from source: magic vars 8238 1726882395.87265: starting attempt loop 8238 1726882395.87272: running the handler 8238 1726882395.87299: _low_level_execute_command(): starting 8238 1726882395.87376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882395.88893: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882395.89010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882395.89120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882395.89192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882395.90957: stdout chunk (state=3): >>>/root <<< 8238 1726882395.91069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882395.91133: stderr chunk (state=3): >>><<< 8238 1726882395.91136: stdout chunk (state=3): >>><<< 8238 1726882395.91328: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882395.91332: _low_level_execute_command(): starting 8238 1726882395.91336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302 `" && echo ansible-tmp-1726882395.9116304-9228-153998289293302="` echo /root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302 `" ) && sleep 0' 8238 1726882395.91887: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882395.91890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882395.91893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882395.91896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882395.91898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882395.91907: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882395.91909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882395.91912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882395.91914: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882395.91916: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882395.92151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882395.92249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882395.92357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882395.94355: stdout chunk (state=3): >>>ansible-tmp-1726882395.9116304-9228-153998289293302=/root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302 <<< 8238 1726882395.94532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882395.94536: stdout chunk (state=3): >>><<< 8238 1726882395.94538: stderr chunk (state=3): >>><<< 8238 1726882395.94640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882395.9116304-9228-153998289293302=/root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882395.94643: variable 'ansible_module_compression' from source: unknown 8238 1726882395.94664: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8238 1726882395.94712: variable 'ansible_facts' from source: unknown 8238 1726882395.94803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/AnsiballZ_stat.py 8238 1726882395.95008: Sending initial data 8238 1726882395.95012: Sent initial data (151 bytes) 8238 1726882395.96436: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882395.96544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882395.96625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882395.96736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882395.98411: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 8238 1726882395.98424: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882395.98651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882395.98838: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmppl183bn9 /root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/AnsiballZ_stat.py <<< 8238 1726882395.98842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmppl183bn9" to remote "/root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/AnsiballZ_stat.py" <<< 8238 1726882396.00294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.00592: stderr chunk (state=3): >>><<< 8238 1726882396.00595: stdout chunk (state=3): >>><<< 8238 1726882396.00620: done transferring module to remote 8238 1726882396.00635: _low_level_execute_command(): starting 8238 1726882396.00640: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/ /root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/AnsiballZ_stat.py && sleep 0' 8238 1726882396.01936: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882396.02041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.02158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.04424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.04430: stdout chunk (state=3): >>><<< 8238 1726882396.04434: stderr chunk (state=3): >>><<< 8238 1726882396.04437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882396.04440: _low_level_execute_command(): starting 8238 1726882396.04442: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/AnsiballZ_stat.py && sleep 0' 8238 1726882396.05604: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882396.05609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882396.05611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882396.05627: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882396.05669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.05898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882396.05916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.06063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.22637: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8238 1726882396.24529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882396.24533: stdout chunk (state=3): >>><<< 8238 1726882396.24536: stderr chunk (state=3): >>><<< 8238 1726882396.24539: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882396.24543: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882396.24546: _low_level_execute_command(): starting 8238 1726882396.24549: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882395.9116304-9228-153998289293302/ > /dev/null 2>&1 && sleep 0' 8238 1726882396.25485: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882396.25501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882396.25518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882396.25547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882396.25600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.25673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882396.25712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.25835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.27863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.27867: stdout chunk (state=3): >>><<< 8238 1726882396.27869: stderr chunk (state=3): >>><<< 8238 1726882396.27928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882396.27931: handler run complete 8238 1726882396.27933: attempt loop complete, returning result 8238 1726882396.27935: _execute() done 8238 1726882396.27939: dumping result to json 8238 1726882396.27947: done dumping result, returning 8238 1726882396.27962: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affc7ec-ae25-54bc-d334-0000000003f9] 8238 1726882396.27973: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003f9 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8238 1726882396.28384: no more pending results, returning what we have 8238 1726882396.28387: results queue empty 8238 1726882396.28388: checking for any_errors_fatal 8238 1726882396.28394: done checking for any_errors_fatal 8238 1726882396.28395: checking for max_fail_percentage 8238 1726882396.28397: done checking for max_fail_percentage 8238 1726882396.28397: checking to see if all hosts have failed and the running result is not ok 8238 1726882396.28398: done checking to see if all hosts have failed 8238 1726882396.28399: getting the remaining hosts for this loop 8238 1726882396.28401: done getting the remaining hosts for this loop 8238 1726882396.28404: getting the next task for host managed_node3 8238 1726882396.28412: done getting next task for host managed_node3 8238 
1726882396.28415: ^ task is: TASK: Set NM profile exist flag based on the profile files 8238 1726882396.28419: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882396.28425: getting variables 8238 1726882396.28427: in VariableManager get_vars() 8238 1726882396.28468: Calling all_inventory to load vars for managed_node3 8238 1726882396.28471: Calling groups_inventory to load vars for managed_node3 8238 1726882396.28474: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882396.28485: Calling all_plugins_play to load vars for managed_node3 8238 1726882396.28488: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882396.28491: Calling groups_plugins_play to load vars for managed_node3 8238 1726882396.29039: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003f9 8238 1726882396.29043: WORKER PROCESS EXITING 8238 1726882396.30635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882396.32489: done with get_vars() 8238 1726882396.32515: done getting variables 8238 1726882396.32582: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:16 -0400 (0:00:00.505) 0:00:26.481 ****** 8238 1726882396.32627: entering _queue_task() for managed_node3/set_fact 8238 1726882396.33074: worker is 1 (out of 1 available) 8238 1726882396.33088: exiting _queue_task() for managed_node3/set_fact 8238 1726882396.33101: done queuing things up, now waiting for results queue to drain 8238 1726882396.33102: waiting for pending results... 
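For orientation, the two tasks being executed here in tasks/get_profile_stat.yml can be sketched roughly as follows. This is a reconstruction from the module_args and skip conditions visible in the surrounding output, not a copy of the real file: the {{ profile }} templating of the path and the body of the skipped set_fact are assumptions.

  - name: Stat profile file
    stat:
      path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # rendered as ifcfg-bond0.0 in this run
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: profile_stat                         # read later as profile_stat.stat.exists

  - name: Set NM profile exist flag based on the profile files     # get_profile_stat.yml:17
    set_fact:
      lsr_net_profile_exists: true                 # assumed fact name; the task skips here, so its body never appears in the log
    when: profile_stat.stat.exists                 # false in this run (stat returned exists: false above), hence the skip that follows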
8238 1726882396.33393: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 8238 1726882396.33508: in run() - task 0affc7ec-ae25-54bc-d334-0000000003fa 8238 1726882396.33524: variable 'ansible_search_path' from source: unknown 8238 1726882396.33529: variable 'ansible_search_path' from source: unknown 8238 1726882396.33571: calling self._execute() 8238 1726882396.33786: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.33789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.33793: variable 'omit' from source: magic vars 8238 1726882396.34107: variable 'ansible_distribution_major_version' from source: facts 8238 1726882396.34117: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882396.34255: variable 'profile_stat' from source: set_fact 8238 1726882396.34266: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882396.34269: when evaluation is False, skipping this task 8238 1726882396.34272: _execute() done 8238 1726882396.34274: dumping result to json 8238 1726882396.34279: done dumping result, returning 8238 1726882396.34285: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affc7ec-ae25-54bc-d334-0000000003fa] 8238 1726882396.34291: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fa 8238 1726882396.34392: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fa 8238 1726882396.34396: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882396.34476: no more pending results, returning what we have 8238 1726882396.34480: results queue empty 8238 1726882396.34482: checking for any_errors_fatal 8238 1726882396.34494: done checking for any_errors_fatal 8238 1726882396.34495: checking for max_fail_percentage 8238 1726882396.34497: done checking for max_fail_percentage 8238 1726882396.34498: checking to see if all hosts have failed and the running result is not ok 8238 1726882396.34499: done checking to see if all hosts have failed 8238 1726882396.34499: getting the remaining hosts for this loop 8238 1726882396.34501: done getting the remaining hosts for this loop 8238 1726882396.34505: getting the next task for host managed_node3 8238 1726882396.34512: done getting next task for host managed_node3 8238 1726882396.34514: ^ task is: TASK: Get NM profile info 8238 1726882396.34521: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882396.34528: getting variables 8238 1726882396.34530: in VariableManager get_vars() 8238 1726882396.34573: Calling all_inventory to load vars for managed_node3 8238 1726882396.34577: Calling groups_inventory to load vars for managed_node3 8238 1726882396.34579: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882396.34597: Calling all_plugins_play to load vars for managed_node3 8238 1726882396.34600: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882396.34604: Calling groups_plugins_play to load vars for managed_node3 8238 1726882396.36195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882396.37484: done with get_vars() 8238 1726882396.37500: done getting variables 8238 1726882396.37549: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:16 -0400 (0:00:00.049) 0:00:26.530 ****** 8238 1726882396.37573: entering _queue_task() for managed_node3/shell 8238 1726882396.37815: worker is 1 (out of 1 available) 8238 1726882396.37831: exiting _queue_task() for managed_node3/shell 8238 1726882396.37845: done queuing things up, now waiting for results queue to drain 8238 1726882396.37846: waiting for pending results... 8238 1726882396.38018: running TaskExecutor() for managed_node3/TASK: Get NM profile info 8238 1726882396.38128: in run() - task 0affc7ec-ae25-54bc-d334-0000000003fb 8238 1726882396.38133: variable 'ansible_search_path' from source: unknown 8238 1726882396.38136: variable 'ansible_search_path' from source: unknown 8238 1726882396.38198: calling self._execute() 8238 1726882396.38331: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.38335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.38341: variable 'omit' from source: magic vars 8238 1726882396.38696: variable 'ansible_distribution_major_version' from source: facts 8238 1726882396.38728: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882396.38732: variable 'omit' from source: magic vars 8238 1726882396.38860: variable 'omit' from source: magic vars 8238 1726882396.38904: variable 'profile' from source: include params 8238 1726882396.38915: variable 'item' from source: include params 8238 1726882396.38993: variable 'item' from source: include params 8238 1726882396.39018: variable 'omit' from source: magic vars 8238 1726882396.39071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882396.39132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882396.39154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882396.39172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882396.39186: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882396.39232: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882396.39236: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.39238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.39318: Set connection var ansible_connection to ssh 8238 1726882396.39321: Set connection var ansible_shell_type to sh 8238 1726882396.39326: Set connection var ansible_pipelining to False 8238 1726882396.39332: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882396.39338: Set connection var ansible_timeout to 10 8238 1726882396.39345: Set connection var ansible_shell_executable to /bin/sh 8238 1726882396.39365: variable 'ansible_shell_executable' from source: unknown 8238 1726882396.39368: variable 'ansible_connection' from source: unknown 8238 1726882396.39370: variable 'ansible_module_compression' from source: unknown 8238 1726882396.39373: variable 'ansible_shell_type' from source: unknown 8238 1726882396.39375: variable 'ansible_shell_executable' from source: unknown 8238 1726882396.39377: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.39382: variable 'ansible_pipelining' from source: unknown 8238 1726882396.39385: variable 'ansible_timeout' from source: unknown 8238 1726882396.39389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.39502: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882396.39517: variable 'omit' from source: magic vars 8238 1726882396.39520: starting attempt loop 8238 1726882396.39525: running the handler 8238 1726882396.39531: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882396.39547: _low_level_execute_command(): starting 8238 1726882396.39555: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882396.40059: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882396.40064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.40068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882396.40071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.40108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882396.40117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.40228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.41990: stdout chunk (state=3): >>>/root <<< 8238 1726882396.42128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.42158: stderr chunk (state=3): >>><<< 8238 1726882396.42160: stdout chunk (state=3): >>><<< 8238 1726882396.42173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882396.42187: _low_level_execute_command(): starting 8238 1726882396.42228: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159 `" && echo ansible-tmp-1726882396.421776-9272-73280635289159="` echo /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159 `" ) && sleep 0' 8238 1726882396.42592: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882396.42603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882396.42631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882396.42636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 
1726882396.42693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882396.42704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.42802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.44792: stdout chunk (state=3): >>>ansible-tmp-1726882396.421776-9272-73280635289159=/root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159 <<< 8238 1726882396.44911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.44957: stderr chunk (state=3): >>><<< 8238 1726882396.44960: stdout chunk (state=3): >>><<< 8238 1726882396.44971: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882396.421776-9272-73280635289159=/root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882396.44995: variable 'ansible_module_compression' from source: unknown 8238 1726882396.45036: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882396.45072: variable 'ansible_facts' from source: unknown 8238 1726882396.45128: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/AnsiballZ_command.py 8238 1726882396.45225: Sending initial data 8238 1726882396.45228: Sent initial data (152 bytes) 8238 1726882396.45665: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882396.45668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882396.45671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882396.45673: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882396.45675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.45721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882396.45726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.45813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.47413: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 8238 1726882396.47416: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882396.47491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882396.47571: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpqfsq90qo /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/AnsiballZ_command.py <<< 8238 1726882396.47579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/AnsiballZ_command.py" <<< 8238 1726882396.47655: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpqfsq90qo" to remote "/root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/AnsiballZ_command.py" <<< 8238 1726882396.48628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.48647: stderr chunk (state=3): >>><<< 8238 1726882396.48660: stdout chunk (state=3): >>><<< 8238 1726882396.48677: done transferring module to remote 8238 1726882396.48686: _low_level_execute_command(): starting 8238 1726882396.48690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/ /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/AnsiballZ_command.py && sleep 0' 8238 1726882396.49102: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882396.49110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882396.49128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.49146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882396.49149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.49198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882396.49203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.49286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.51330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.51334: stdout chunk (state=3): >>><<< 8238 1726882396.51336: stderr chunk (state=3): >>><<< 8238 1726882396.51339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882396.51342: _low_level_execute_command(): starting 8238 1726882396.51344: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/AnsiballZ_command.py && sleep 0' 8238 1726882396.51746: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882396.51757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882396.51767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882396.51808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.51854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882396.51873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.51957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.71000: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:16.685368", "end": "2024-09-20 21:33:16.708345", "delta": "0:00:00.022977", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882396.72609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882396.72679: stderr chunk (state=3): >>><<< 8238 1726882396.72683: stdout chunk (state=3): >>><<< 8238 1726882396.72699: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:16.685368", "end": "2024-09-20 21:33:16.708345", "delta": "0:00:00.022977", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.45.226 closed. 8238 1726882396.72732: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882396.72742: _low_level_execute_command(): starting 8238 1726882396.72747: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882396.421776-9272-73280635289159/ > /dev/null 2>&1 && sleep 0' 8238 1726882396.73210: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882396.73214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882396.73247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882396.73250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882396.73255: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882396.73258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882396.73313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882396.73320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882396.73325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882396.73406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882396.75398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882396.75410: stderr chunk (state=3): >>><<< 8238 1726882396.75532: stdout chunk (state=3): >>><<< 8238 1726882396.75550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882396.75561: handler run complete 8238 1726882396.75588: Evaluated conditional (False): False 8238 1726882396.75599: attempt loop complete, returning result 8238 1726882396.75602: _execute() done 8238 1726882396.75605: dumping result to json 8238 1726882396.75615: done dumping result, returning 8238 1726882396.75827: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affc7ec-ae25-54bc-d334-0000000003fb] 8238 1726882396.75830: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fb 8238 1726882396.75904: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fb 8238 1726882396.75908: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.022977", "end": "2024-09-20 21:33:16.708345", "rc": 0, "start": "2024-09-20 21:33:16.685368" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 8238 1726882396.76145: no more pending results, returning what we have 8238 1726882396.76149: results queue empty 8238 1726882396.76151: checking for any_errors_fatal 8238 1726882396.76159: done checking for any_errors_fatal 8238 1726882396.76160: checking for max_fail_percentage 8238 1726882396.76162: done checking for max_fail_percentage 8238 1726882396.76163: checking to see if all hosts have failed and the running result is not ok 8238 1726882396.76164: done checking to see if all hosts have failed 8238 1726882396.76165: getting the remaining hosts for this loop 8238 1726882396.76167: done getting the remaining hosts for this loop 8238 1726882396.76172: getting the next task for host managed_node3 8238 1726882396.76180: done getting next task for host managed_node3 8238 1726882396.76183: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8238 1726882396.76188: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882396.76198: getting variables 8238 1726882396.76203: in VariableManager get_vars() 8238 1726882396.76487: Calling all_inventory to load vars for managed_node3 8238 1726882396.76491: Calling groups_inventory to load vars for managed_node3 8238 1726882396.76494: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882396.76505: Calling all_plugins_play to load vars for managed_node3 8238 1726882396.76508: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882396.76512: Calling groups_plugins_play to load vars for managed_node3 8238 1726882396.79543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882396.82691: done with get_vars() 8238 1726882396.82720: done getting variables 8238 1726882396.82786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:16 -0400 (0:00:00.452) 0:00:26.983 ****** 8238 1726882396.82820: entering _queue_task() for managed_node3/set_fact 8238 1726882396.83579: worker is 1 (out of 1 available) 8238 1726882396.83596: exiting _queue_task() for managed_node3/set_fact 8238 1726882396.83612: done queuing things up, now waiting for results queue to drain 8238 1726882396.83614: waiting for pending results... 8238 1726882396.84062: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8238 1726882396.84429: in run() - task 0affc7ec-ae25-54bc-d334-0000000003fc 8238 1726882396.84434: variable 'ansible_search_path' from source: unknown 8238 1726882396.84437: variable 'ansible_search_path' from source: unknown 8238 1726882396.84444: calling self._execute() 8238 1726882396.84748: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.84755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.84770: variable 'omit' from source: magic vars 8238 1726882396.85690: variable 'ansible_distribution_major_version' from source: facts 8238 1726882396.85793: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882396.85991: variable 'nm_profile_exists' from source: set_fact 8238 1726882396.86008: Evaluated conditional (nm_profile_exists.rc == 0): True 8238 1726882396.86129: variable 'omit' from source: magic vars 8238 1726882396.86190: variable 'omit' from source: magic vars 8238 1726882396.86225: variable 'omit' from source: magic vars 8238 1726882396.86457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882396.86627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882396.86632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882396.86649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882396.86665: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882396.86697: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882396.86701: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.86704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.86932: Set connection var ansible_connection to ssh 8238 1726882396.86936: Set connection var ansible_shell_type to sh 8238 1726882396.86941: Set connection var ansible_pipelining to False 8238 1726882396.87035: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882396.87043: Set connection var ansible_timeout to 10 8238 1726882396.87052: Set connection var ansible_shell_executable to /bin/sh 8238 1726882396.87083: variable 'ansible_shell_executable' from source: unknown 8238 1726882396.87086: variable 'ansible_connection' from source: unknown 8238 1726882396.87089: variable 'ansible_module_compression' from source: unknown 8238 1726882396.87091: variable 'ansible_shell_type' from source: unknown 8238 1726882396.87094: variable 'ansible_shell_executable' from source: unknown 8238 1726882396.87096: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.87101: variable 'ansible_pipelining' from source: unknown 8238 1726882396.87104: variable 'ansible_timeout' from source: unknown 8238 1726882396.87182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.87466: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882396.87477: variable 'omit' from source: magic vars 8238 1726882396.87483: starting attempt loop 8238 1726882396.87486: running the handler 8238 1726882396.87617: handler run complete 8238 1726882396.87630: attempt loop complete, returning result 8238 1726882396.87634: _execute() done 8238 1726882396.87637: dumping result to json 8238 1726882396.87639: done dumping result, returning 8238 1726882396.87649: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affc7ec-ae25-54bc-d334-0000000003fc] 8238 1726882396.87657: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fc ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8238 1726882396.87915: no more pending results, returning what we have 8238 1726882396.87919: results queue empty 8238 1726882396.87920: checking for any_errors_fatal 8238 1726882396.87933: done checking for any_errors_fatal 8238 1726882396.87934: checking for max_fail_percentage 8238 1726882396.87936: done checking for max_fail_percentage 8238 1726882396.87937: checking to see if all hosts have failed and the running result is not ok 8238 1726882396.87938: done checking to see if all hosts have failed 8238 1726882396.87939: getting the remaining hosts for this loop 8238 1726882396.87941: done getting the remaining hosts for this loop 8238 1726882396.87945: getting the next task for host managed_node3 8238 1726882396.87958: done getting next task for host managed_node3 8238 1726882396.87961: ^ task is: 
TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8238 1726882396.87968: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882396.87973: getting variables 8238 1726882396.87975: in VariableManager get_vars() 8238 1726882396.88020: Calling all_inventory to load vars for managed_node3 8238 1726882396.88126: Calling groups_inventory to load vars for managed_node3 8238 1726882396.88130: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882396.88141: Calling all_plugins_play to load vars for managed_node3 8238 1726882396.88144: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882396.88147: Calling groups_plugins_play to load vars for managed_node3 8238 1726882396.88945: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fc 8238 1726882396.88949: WORKER PROCESS EXITING 8238 1726882396.91763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882396.95811: done with get_vars() 8238 1726882396.95849: done getting variables 8238 1726882396.95912: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882396.96238: variable 'profile' from source: include params 8238 1726882396.96242: variable 'item' from source: include params 8238 1726882396.96304: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:16 -0400 (0:00:00.137) 0:00:27.120 ****** 8238 1726882396.96547: entering _queue_task() for managed_node3/command 8238 1726882396.97102: worker is 1 (out of 1 available) 8238 1726882396.97116: exiting _queue_task() for managed_node3/command 8238 1726882396.97330: done queuing things up, now waiting for results queue to drain 8238 1726882396.97333: waiting for pending results... 
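The "Get NM profile info" / "Set NM profile exist flag and ansible_managed flag true" pair that just completed amounts to the sketch below. The command, the set_fact values and the two conditionals are taken from the results above; the register name is inferred from the nm_profile_exists.rc == 0 conditional, and the {{ profile }} templating of the grep pattern is an assumption (the literal command in this run used bond0.0).

  - name: Get NM profile info                      # get_profile_stat.yml:25
    shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
    register: nm_profile_exists                    # rc=0 here: bond0.0 lives at /etc/NetworkManager/system-connections/bond0.0.nmconnection

  - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output   # get_profile_stat.yml:35
    set_fact:
      lsr_net_profile_exists: true
      lsr_net_profile_ansible_managed: true
      lsr_net_profile_fingerprint: true
    when: nm_profile_exists.rc == 0                # evaluated True above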
8238 1726882396.97727: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 8238 1726882396.98040: in run() - task 0affc7ec-ae25-54bc-d334-0000000003fe 8238 1726882396.98055: variable 'ansible_search_path' from source: unknown 8238 1726882396.98062: variable 'ansible_search_path' from source: unknown 8238 1726882396.98101: calling self._execute() 8238 1726882396.98200: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882396.98206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882396.98218: variable 'omit' from source: magic vars 8238 1726882396.98926: variable 'ansible_distribution_major_version' from source: facts 8238 1726882396.99145: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882396.99276: variable 'profile_stat' from source: set_fact 8238 1726882396.99291: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882396.99295: when evaluation is False, skipping this task 8238 1726882396.99298: _execute() done 8238 1726882396.99301: dumping result to json 8238 1726882396.99303: done dumping result, returning 8238 1726882396.99313: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affc7ec-ae25-54bc-d334-0000000003fe] 8238 1726882396.99318: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fe 8238 1726882396.99828: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003fe 8238 1726882396.99832: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882396.99889: no more pending results, returning what we have 8238 1726882396.99893: results queue empty 8238 1726882396.99894: checking for any_errors_fatal 8238 1726882396.99900: done checking for any_errors_fatal 8238 1726882396.99901: checking for max_fail_percentage 8238 1726882396.99903: done checking for max_fail_percentage 8238 1726882396.99904: checking to see if all hosts have failed and the running result is not ok 8238 1726882396.99905: done checking to see if all hosts have failed 8238 1726882396.99905: getting the remaining hosts for this loop 8238 1726882396.99907: done getting the remaining hosts for this loop 8238 1726882396.99911: getting the next task for host managed_node3 8238 1726882396.99919: done getting next task for host managed_node3 8238 1726882396.99925: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8238 1726882396.99930: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882396.99935: getting variables 8238 1726882396.99936: in VariableManager get_vars() 8238 1726882396.99984: Calling all_inventory to load vars for managed_node3 8238 1726882396.99987: Calling groups_inventory to load vars for managed_node3 8238 1726882396.99990: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.00004: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.00008: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.00011: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.03487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.07962: done with get_vars() 8238 1726882397.07992: done getting variables 8238 1726882397.08262: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882397.08383: variable 'profile' from source: include params 8238 1726882397.08387: variable 'item' from source: include params 8238 1726882397.08652: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:17 -0400 (0:00:00.121) 0:00:27.241 ****** 8238 1726882397.08686: entering _queue_task() for managed_node3/set_fact 8238 1726882397.09605: worker is 1 (out of 1 available) 8238 1726882397.09623: exiting _queue_task() for managed_node3/set_fact 8238 1726882397.09638: done queuing things up, now waiting for results queue to drain 8238 1726882397.09640: waiting for pending results... 
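The ifcfg comment checks that come next all gate on the earlier stat result, so they skip in this run: the profile is stored as a NetworkManager keyfile rather than an initscripts ifcfg file. Only the task name, the plugin type (command) and the false_condition are visible in the log for the task that just skipped; the grep body below is a hypothetical illustration of such a check, not the real task body.

  - name: Get the ansible_managed comment in ifcfg-{{ profile }}    # get_profile_stat.yml:49
    command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical body
    when: profile_stat.stat.exists                 # false, per the skip result above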
8238 1726882397.10209: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 8238 1726882397.10429: in run() - task 0affc7ec-ae25-54bc-d334-0000000003ff 8238 1726882397.10534: variable 'ansible_search_path' from source: unknown 8238 1726882397.10540: variable 'ansible_search_path' from source: unknown 8238 1726882397.10578: calling self._execute() 8238 1726882397.10677: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.10684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.10696: variable 'omit' from source: magic vars 8238 1726882397.11483: variable 'ansible_distribution_major_version' from source: facts 8238 1726882397.11495: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882397.11832: variable 'profile_stat' from source: set_fact 8238 1726882397.11846: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882397.11849: when evaluation is False, skipping this task 8238 1726882397.11858: _execute() done 8238 1726882397.11861: dumping result to json 8238 1726882397.11863: done dumping result, returning 8238 1726882397.11870: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affc7ec-ae25-54bc-d334-0000000003ff] 8238 1726882397.11875: sending task result for task 0affc7ec-ae25-54bc-d334-0000000003ff 8238 1726882397.12329: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000003ff 8238 1726882397.12332: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882397.12417: no more pending results, returning what we have 8238 1726882397.12420: results queue empty 8238 1726882397.12424: checking for any_errors_fatal 8238 1726882397.12431: done checking for any_errors_fatal 8238 1726882397.12431: checking for max_fail_percentage 8238 1726882397.12433: done checking for max_fail_percentage 8238 1726882397.12434: checking to see if all hosts have failed and the running result is not ok 8238 1726882397.12435: done checking to see if all hosts have failed 8238 1726882397.12436: getting the remaining hosts for this loop 8238 1726882397.12438: done getting the remaining hosts for this loop 8238 1726882397.12442: getting the next task for host managed_node3 8238 1726882397.12450: done getting next task for host managed_node3 8238 1726882397.12453: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8238 1726882397.12458: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882397.12462: getting variables 8238 1726882397.12464: in VariableManager get_vars() 8238 1726882397.12508: Calling all_inventory to load vars for managed_node3 8238 1726882397.12512: Calling groups_inventory to load vars for managed_node3 8238 1726882397.12514: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.12828: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.12833: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.12837: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.16900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.21255: done with get_vars() 8238 1726882397.21288: done getting variables 8238 1726882397.21360: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882397.21711: variable 'profile' from source: include params 8238 1726882397.21715: variable 'item' from source: include params 8238 1726882397.21782: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:17 -0400 (0:00:00.131) 0:00:27.373 ****** 8238 1726882397.21814: entering _queue_task() for managed_node3/command 8238 1726882397.22578: worker is 1 (out of 1 available) 8238 1726882397.22595: exiting _queue_task() for managed_node3/command 8238 1726882397.22610: done queuing things up, now waiting for results queue to drain 8238 1726882397.22612: waiting for pending results... 
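The task queued here, "Get the fingerprint comment in ifcfg-bond0.0" (get_profile_stat.yml:62), resolves to the command action module. A plausible shape for it, under the same profile_stat guard; the grep pattern, file path and register name below are assumptions, not taken from the log:

    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      command: grep '^# system_role:network' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: fingerprint_comment       # hypothetical name
      failed_when: false                  # a missing comment should not fail the play
      when: profile_stat.stat.exists      # same guard as above, False for this file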
8238 1726882397.23088: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 8238 1726882397.23441: in run() - task 0affc7ec-ae25-54bc-d334-000000000400 8238 1726882397.23454: variable 'ansible_search_path' from source: unknown 8238 1726882397.23462: variable 'ansible_search_path' from source: unknown 8238 1726882397.23501: calling self._execute() 8238 1726882397.23707: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.23714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.23863: variable 'omit' from source: magic vars 8238 1726882397.24609: variable 'ansible_distribution_major_version' from source: facts 8238 1726882397.24734: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882397.24993: variable 'profile_stat' from source: set_fact 8238 1726882397.24997: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882397.25000: when evaluation is False, skipping this task 8238 1726882397.25003: _execute() done 8238 1726882397.25005: dumping result to json 8238 1726882397.25028: done dumping result, returning 8238 1726882397.25031: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affc7ec-ae25-54bc-d334-000000000400] 8238 1726882397.25034: sending task result for task 0affc7ec-ae25-54bc-d334-000000000400 8238 1726882397.25175: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000400 8238 1726882397.25178: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882397.25265: no more pending results, returning what we have 8238 1726882397.25270: results queue empty 8238 1726882397.25271: checking for any_errors_fatal 8238 1726882397.25282: done checking for any_errors_fatal 8238 1726882397.25282: checking for max_fail_percentage 8238 1726882397.25284: done checking for max_fail_percentage 8238 1726882397.25285: checking to see if all hosts have failed and the running result is not ok 8238 1726882397.25286: done checking to see if all hosts have failed 8238 1726882397.25287: getting the remaining hosts for this loop 8238 1726882397.25288: done getting the remaining hosts for this loop 8238 1726882397.25293: getting the next task for host managed_node3 8238 1726882397.25302: done getting next task for host managed_node3 8238 1726882397.25305: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8238 1726882397.25312: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882397.25316: getting variables 8238 1726882397.25318: in VariableManager get_vars() 8238 1726882397.25363: Calling all_inventory to load vars for managed_node3 8238 1726882397.25367: Calling groups_inventory to load vars for managed_node3 8238 1726882397.25369: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.25382: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.25385: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.25387: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.28802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.33136: done with get_vars() 8238 1726882397.33172: done getting variables 8238 1726882397.33446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882397.33563: variable 'profile' from source: include params 8238 1726882397.33568: variable 'item' from source: include params 8238 1726882397.33836: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:17 -0400 (0:00:00.120) 0:00:27.493 ****** 8238 1726882397.33873: entering _queue_task() for managed_node3/set_fact 8238 1726882397.34475: worker is 1 (out of 1 available) 8238 1726882397.34488: exiting _queue_task() for managed_node3/set_fact 8238 1726882397.34499: done queuing things up, now waiting for results queue to drain 8238 1726882397.34501: waiting for pending results... 
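The matching "Verify the fingerprint comment" step queued here (get_profile_stat.yml:69) is a set_fact that flips the lsr_net_profile_fingerprint flag checked by the assert further down. A hedged sketch, reusing the hypothetical register name from the previous snippet:

    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_fingerprint: true
      when:
        - profile_stat.stat.exists        # the guard that evaluated to False here
        - fingerprint_comment.rc == 0     # hypothetical register from the sketch above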
8238 1726882397.34943: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 8238 1726882397.35181: in run() - task 0affc7ec-ae25-54bc-d334-000000000401 8238 1726882397.35267: variable 'ansible_search_path' from source: unknown 8238 1726882397.35271: variable 'ansible_search_path' from source: unknown 8238 1726882397.35310: calling self._execute() 8238 1726882397.35505: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.35509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.35511: variable 'omit' from source: magic vars 8238 1726882397.36354: variable 'ansible_distribution_major_version' from source: facts 8238 1726882397.36358: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882397.36534: variable 'profile_stat' from source: set_fact 8238 1726882397.36552: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882397.36555: when evaluation is False, skipping this task 8238 1726882397.36558: _execute() done 8238 1726882397.36561: dumping result to json 8238 1726882397.36567: done dumping result, returning 8238 1726882397.36573: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affc7ec-ae25-54bc-d334-000000000401] 8238 1726882397.36579: sending task result for task 0affc7ec-ae25-54bc-d334-000000000401 8238 1726882397.36861: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000401 8238 1726882397.36865: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882397.36902: no more pending results, returning what we have 8238 1726882397.36905: results queue empty 8238 1726882397.36906: checking for any_errors_fatal 8238 1726882397.36910: done checking for any_errors_fatal 8238 1726882397.36911: checking for max_fail_percentage 8238 1726882397.36912: done checking for max_fail_percentage 8238 1726882397.36913: checking to see if all hosts have failed and the running result is not ok 8238 1726882397.36914: done checking to see if all hosts have failed 8238 1726882397.36915: getting the remaining hosts for this loop 8238 1726882397.36916: done getting the remaining hosts for this loop 8238 1726882397.36919: getting the next task for host managed_node3 8238 1726882397.36928: done getting next task for host managed_node3 8238 1726882397.36930: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8238 1726882397.36934: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882397.36937: getting variables 8238 1726882397.36939: in VariableManager get_vars() 8238 1726882397.36977: Calling all_inventory to load vars for managed_node3 8238 1726882397.36980: Calling groups_inventory to load vars for managed_node3 8238 1726882397.36983: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.36993: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.36996: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.36999: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.38661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.41666: done with get_vars() 8238 1726882397.41694: done getting variables 8238 1726882397.41764: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882397.41889: variable 'profile' from source: include params 8238 1726882397.41894: variable 'item' from source: include params 8238 1726882397.41961: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:17 -0400 (0:00:00.081) 0:00:27.575 ****** 8238 1726882397.41994: entering _queue_task() for managed_node3/assert 8238 1726882397.42513: worker is 1 (out of 1 available) 8238 1726882397.42532: exiting _queue_task() for managed_node3/assert 8238 1726882397.42544: done queuing things up, now waiting for results queue to drain 8238 1726882397.42546: waiting for pending results... 
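The assert queued here only checks a flag that earlier tasks in the included file have already set, so it runs entirely on the controller and completes in milliseconds. Its condition, lsr_net_profile_exists, is quoted verbatim in the "Evaluated conditional" line below; the fail_msg wording is an assumption:

    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists                         # evaluated to True below
        fail_msg: "Profile {{ profile }} was not found"    # hypothetical message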
8238 1726882397.43115: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 8238 1726882397.43218: in run() - task 0affc7ec-ae25-54bc-d334-000000000267 8238 1726882397.43439: variable 'ansible_search_path' from source: unknown 8238 1726882397.43443: variable 'ansible_search_path' from source: unknown 8238 1726882397.43490: calling self._execute() 8238 1726882397.43700: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.43708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.43717: variable 'omit' from source: magic vars 8238 1726882397.44511: variable 'ansible_distribution_major_version' from source: facts 8238 1726882397.44525: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882397.44674: variable 'omit' from source: magic vars 8238 1726882397.44686: variable 'omit' from source: magic vars 8238 1726882397.44910: variable 'profile' from source: include params 8238 1726882397.44913: variable 'item' from source: include params 8238 1726882397.45037: variable 'item' from source: include params 8238 1726882397.45060: variable 'omit' from source: magic vars 8238 1726882397.45218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882397.45261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882397.45279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882397.45298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882397.45426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882397.45461: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882397.45465: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.45468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.45693: Set connection var ansible_connection to ssh 8238 1726882397.45696: Set connection var ansible_shell_type to sh 8238 1726882397.45702: Set connection var ansible_pipelining to False 8238 1726882397.45753: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882397.45756: Set connection var ansible_timeout to 10 8238 1726882397.45765: Set connection var ansible_shell_executable to /bin/sh 8238 1726882397.45933: variable 'ansible_shell_executable' from source: unknown 8238 1726882397.45936: variable 'ansible_connection' from source: unknown 8238 1726882397.45939: variable 'ansible_module_compression' from source: unknown 8238 1726882397.45942: variable 'ansible_shell_type' from source: unknown 8238 1726882397.45944: variable 'ansible_shell_executable' from source: unknown 8238 1726882397.45983: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.45986: variable 'ansible_pipelining' from source: unknown 8238 1726882397.45989: variable 'ansible_timeout' from source: unknown 8238 1726882397.45992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.46241: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882397.46253: variable 'omit' from source: magic vars 8238 1726882397.46313: starting attempt loop 8238 1726882397.46317: running the handler 8238 1726882397.46498: variable 'lsr_net_profile_exists' from source: set_fact 8238 1726882397.46620: Evaluated conditional (lsr_net_profile_exists): True 8238 1726882397.46628: handler run complete 8238 1726882397.46646: attempt loop complete, returning result 8238 1726882397.46649: _execute() done 8238 1726882397.46652: dumping result to json 8238 1726882397.46659: done dumping result, returning 8238 1726882397.46667: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [0affc7ec-ae25-54bc-d334-000000000267] 8238 1726882397.46727: sending task result for task 0affc7ec-ae25-54bc-d334-000000000267 8238 1726882397.47028: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000267 8238 1726882397.47032: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882397.47080: no more pending results, returning what we have 8238 1726882397.47083: results queue empty 8238 1726882397.47084: checking for any_errors_fatal 8238 1726882397.47089: done checking for any_errors_fatal 8238 1726882397.47090: checking for max_fail_percentage 8238 1726882397.47091: done checking for max_fail_percentage 8238 1726882397.47092: checking to see if all hosts have failed and the running result is not ok 8238 1726882397.47093: done checking to see if all hosts have failed 8238 1726882397.47094: getting the remaining hosts for this loop 8238 1726882397.47096: done getting the remaining hosts for this loop 8238 1726882397.47099: getting the next task for host managed_node3 8238 1726882397.47105: done getting next task for host managed_node3 8238 1726882397.47108: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8238 1726882397.47111: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882397.47114: getting variables 8238 1726882397.47116: in VariableManager get_vars() 8238 1726882397.47156: Calling all_inventory to load vars for managed_node3 8238 1726882397.47159: Calling groups_inventory to load vars for managed_node3 8238 1726882397.47162: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.47173: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.47176: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.47180: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.50417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.54659: done with get_vars() 8238 1726882397.54696: done getting variables 8238 1726882397.55069: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882397.55195: variable 'profile' from source: include params 8238 1726882397.55199: variable 'item' from source: include params 8238 1726882397.55466: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:17 -0400 (0:00:00.135) 0:00:27.710 ****** 8238 1726882397.55506: entering _queue_task() for managed_node3/assert 8238 1726882397.56268: worker is 1 (out of 1 available) 8238 1726882397.56281: exiting _queue_task() for managed_node3/assert 8238 1726882397.56293: done queuing things up, now waiting for results queue to drain 8238 1726882397.56295: waiting for pending results... 
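The bond0.0 in these task headers is not hard-coded: the raw task names contain '{{ profile }}', and the "variable 'profile' from source: include params" lines show the value arriving from whatever include pulled assert_profile_present.yml in. A hedged sketch of how such a caller typically passes the value; the surrounding playbook, task name and loop variable are assumptions, only the file name and the profile/item variable names appear in the log:

    # Hypothetical caller of the included task file.
    - name: Check each profile created by the role
      include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ item }}"
      loop: "{{ profile_names }}"   # hypothetical list; bond0.0 is the item in flight here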
8238 1726882397.56635: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 8238 1726882397.56981: in run() - task 0affc7ec-ae25-54bc-d334-000000000268 8238 1726882397.56995: variable 'ansible_search_path' from source: unknown 8238 1726882397.56999: variable 'ansible_search_path' from source: unknown 8238 1726882397.57042: calling self._execute() 8238 1726882397.57303: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.57312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.57325: variable 'omit' from source: magic vars 8238 1726882397.58185: variable 'ansible_distribution_major_version' from source: facts 8238 1726882397.58197: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882397.58204: variable 'omit' from source: magic vars 8238 1726882397.58276: variable 'omit' from source: magic vars 8238 1726882397.58514: variable 'profile' from source: include params 8238 1726882397.58518: variable 'item' from source: include params 8238 1726882397.58588: variable 'item' from source: include params 8238 1726882397.58719: variable 'omit' from source: magic vars 8238 1726882397.58821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882397.58826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882397.58942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882397.58967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882397.58979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882397.59014: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882397.59017: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.59020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.59378: Set connection var ansible_connection to ssh 8238 1726882397.59381: Set connection var ansible_shell_type to sh 8238 1726882397.59475: Set connection var ansible_pipelining to False 8238 1726882397.59478: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882397.59480: Set connection var ansible_timeout to 10 8238 1726882397.59483: Set connection var ansible_shell_executable to /bin/sh 8238 1726882397.59485: variable 'ansible_shell_executable' from source: unknown 8238 1726882397.59487: variable 'ansible_connection' from source: unknown 8238 1726882397.59490: variable 'ansible_module_compression' from source: unknown 8238 1726882397.59492: variable 'ansible_shell_type' from source: unknown 8238 1726882397.59494: variable 'ansible_shell_executable' from source: unknown 8238 1726882397.59496: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.59498: variable 'ansible_pipelining' from source: unknown 8238 1726882397.59500: variable 'ansible_timeout' from source: unknown 8238 1726882397.59502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.59698: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882397.59827: variable 'omit' from source: magic vars 8238 1726882397.59834: starting attempt loop 8238 1726882397.59837: running the handler 8238 1726882397.60066: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8238 1726882397.60070: Evaluated conditional (lsr_net_profile_ansible_managed): True 8238 1726882397.60078: handler run complete 8238 1726882397.60097: attempt loop complete, returning result 8238 1726882397.60100: _execute() done 8238 1726882397.60103: dumping result to json 8238 1726882397.60105: done dumping result, returning 8238 1726882397.60129: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affc7ec-ae25-54bc-d334-000000000268] 8238 1726882397.60132: sending task result for task 0affc7ec-ae25-54bc-d334-000000000268 8238 1726882397.60396: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000268 8238 1726882397.60399: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882397.60448: no more pending results, returning what we have 8238 1726882397.60452: results queue empty 8238 1726882397.60453: checking for any_errors_fatal 8238 1726882397.60459: done checking for any_errors_fatal 8238 1726882397.60460: checking for max_fail_percentage 8238 1726882397.60461: done checking for max_fail_percentage 8238 1726882397.60462: checking to see if all hosts have failed and the running result is not ok 8238 1726882397.60463: done checking to see if all hosts have failed 8238 1726882397.60464: getting the remaining hosts for this loop 8238 1726882397.60465: done getting the remaining hosts for this loop 8238 1726882397.60470: getting the next task for host managed_node3 8238 1726882397.60476: done getting next task for host managed_node3 8238 1726882397.60478: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8238 1726882397.60482: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882397.60486: getting variables 8238 1726882397.60487: in VariableManager get_vars() 8238 1726882397.60529: Calling all_inventory to load vars for managed_node3 8238 1726882397.60532: Calling groups_inventory to load vars for managed_node3 8238 1726882397.60534: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.60545: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.60548: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.60551: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.64140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.68200: done with get_vars() 8238 1726882397.68435: done getting variables 8238 1726882397.68500: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882397.68618: variable 'profile' from source: include params 8238 1726882397.68825: variable 'item' from source: include params 8238 1726882397.68890: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:17 -0400 (0:00:00.134) 0:00:27.844 ****** 8238 1726882397.68933: entering _queue_task() for managed_node3/assert 8238 1726882397.69692: worker is 1 (out of 1 available) 8238 1726882397.69707: exiting _queue_task() for managed_node3/assert 8238 1726882397.69720: done queuing things up, now waiting for results queue to drain 8238 1726882397.69924: waiting for pending results... 
8238 1726882397.70340: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 8238 1726882397.70477: in run() - task 0affc7ec-ae25-54bc-d334-000000000269 8238 1726882397.70483: variable 'ansible_search_path' from source: unknown 8238 1726882397.70486: variable 'ansible_search_path' from source: unknown 8238 1726882397.70602: calling self._execute() 8238 1726882397.70766: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.70774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.70802: variable 'omit' from source: magic vars 8238 1726882397.71698: variable 'ansible_distribution_major_version' from source: facts 8238 1726882397.71784: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882397.71789: variable 'omit' from source: magic vars 8238 1726882397.71879: variable 'omit' from source: magic vars 8238 1726882397.72065: variable 'profile' from source: include params 8238 1726882397.72075: variable 'item' from source: include params 8238 1726882397.72144: variable 'item' from source: include params 8238 1726882397.72166: variable 'omit' from source: magic vars 8238 1726882397.72326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882397.72366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882397.72385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882397.72523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882397.72535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882397.72657: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882397.72660: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.72663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.72809: Set connection var ansible_connection to ssh 8238 1726882397.72812: Set connection var ansible_shell_type to sh 8238 1726882397.72817: Set connection var ansible_pipelining to False 8238 1726882397.72825: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882397.72950: Set connection var ansible_timeout to 10 8238 1726882397.72960: Set connection var ansible_shell_executable to /bin/sh 8238 1726882397.72987: variable 'ansible_shell_executable' from source: unknown 8238 1726882397.72990: variable 'ansible_connection' from source: unknown 8238 1726882397.72992: variable 'ansible_module_compression' from source: unknown 8238 1726882397.72995: variable 'ansible_shell_type' from source: unknown 8238 1726882397.72997: variable 'ansible_shell_executable' from source: unknown 8238 1726882397.73001: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.73006: variable 'ansible_pipelining' from source: unknown 8238 1726882397.73009: variable 'ansible_timeout' from source: unknown 8238 1726882397.73014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.73345: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882397.73418: variable 'omit' from source: magic vars 8238 1726882397.73421: starting attempt loop 8238 1726882397.73426: running the handler 8238 1726882397.73713: variable 'lsr_net_profile_fingerprint' from source: set_fact 8238 1726882397.73719: Evaluated conditional (lsr_net_profile_fingerprint): True 8238 1726882397.73726: handler run complete 8238 1726882397.73833: attempt loop complete, returning result 8238 1726882397.73837: _execute() done 8238 1726882397.73840: dumping result to json 8238 1726882397.73842: done dumping result, returning 8238 1726882397.73853: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0affc7ec-ae25-54bc-d334-000000000269] 8238 1726882397.73859: sending task result for task 0affc7ec-ae25-54bc-d334-000000000269 8238 1726882397.73957: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000269 8238 1726882397.73963: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882397.74015: no more pending results, returning what we have 8238 1726882397.74019: results queue empty 8238 1726882397.74021: checking for any_errors_fatal 8238 1726882397.74030: done checking for any_errors_fatal 8238 1726882397.74031: checking for max_fail_percentage 8238 1726882397.74033: done checking for max_fail_percentage 8238 1726882397.74034: checking to see if all hosts have failed and the running result is not ok 8238 1726882397.74035: done checking to see if all hosts have failed 8238 1726882397.74035: getting the remaining hosts for this loop 8238 1726882397.74037: done getting the remaining hosts for this loop 8238 1726882397.74040: getting the next task for host managed_node3 8238 1726882397.74050: done getting next task for host managed_node3 8238 1726882397.74054: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8238 1726882397.74058: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882397.74062: getting variables 8238 1726882397.74063: in VariableManager get_vars() 8238 1726882397.74103: Calling all_inventory to load vars for managed_node3 8238 1726882397.74106: Calling groups_inventory to load vars for managed_node3 8238 1726882397.74108: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.74120: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.74426: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.74432: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.77672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.82086: done with get_vars() 8238 1726882397.82113: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:17 -0400 (0:00:00.132) 0:00:27.977 ****** 8238 1726882397.82215: entering _queue_task() for managed_node3/include_tasks 8238 1726882397.82851: worker is 1 (out of 1 available) 8238 1726882397.82864: exiting _queue_task() for managed_node3/include_tasks 8238 1726882397.82879: done queuing things up, now waiting for results queue to drain 8238 1726882397.82881: waiting for pending results... 8238 1726882397.83340: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 8238 1726882397.83345: in run() - task 0affc7ec-ae25-54bc-d334-00000000026d 8238 1726882397.83349: variable 'ansible_search_path' from source: unknown 8238 1726882397.83352: variable 'ansible_search_path' from source: unknown 8238 1726882397.83398: calling self._execute() 8238 1726882397.83501: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882397.83579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882397.83582: variable 'omit' from source: magic vars 8238 1726882397.83932: variable 'ansible_distribution_major_version' from source: facts 8238 1726882397.83951: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882397.83962: _execute() done 8238 1726882397.83971: dumping result to json 8238 1726882397.83980: done dumping result, returning 8238 1726882397.83991: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affc7ec-ae25-54bc-d334-00000000026d] 8238 1726882397.84000: sending task result for task 0affc7ec-ae25-54bc-d334-00000000026d 8238 1726882397.84158: no more pending results, returning what we have 8238 1726882397.84165: in VariableManager get_vars() 8238 1726882397.84216: Calling all_inventory to load vars for managed_node3 8238 1726882397.84220: Calling groups_inventory to load vars for managed_node3 8238 1726882397.84225: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.84241: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.84244: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.84247: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.90533: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000026d 8238 1726882397.90538: WORKER PROCESS EXITING 8238 1726882397.91676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.94751: 
done with get_vars() 8238 1726882397.94782: variable 'ansible_search_path' from source: unknown 8238 1726882397.94783: variable 'ansible_search_path' from source: unknown 8238 1726882397.94824: we have included files to process 8238 1726882397.94825: generating all_blocks data 8238 1726882397.94827: done generating all_blocks data 8238 1726882397.94830: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882397.94831: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882397.94833: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8238 1726882397.95784: done processing included file 8238 1726882397.95787: iterating over new_blocks loaded from include file 8238 1726882397.95788: in VariableManager get_vars() 8238 1726882397.95812: done with get_vars() 8238 1726882397.95814: filtering new block on tags 8238 1726882397.95843: done filtering new block on tags 8238 1726882397.95846: in VariableManager get_vars() 8238 1726882397.95869: done with get_vars() 8238 1726882397.95871: filtering new block on tags 8238 1726882397.95895: done filtering new block on tags 8238 1726882397.95898: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 8238 1726882397.95902: extending task lists for all hosts with included blocks 8238 1726882397.96108: done extending task lists 8238 1726882397.96109: done processing included files 8238 1726882397.96110: results queue empty 8238 1726882397.96111: checking for any_errors_fatal 8238 1726882397.96114: done checking for any_errors_fatal 8238 1726882397.96115: checking for max_fail_percentage 8238 1726882397.96116: done checking for max_fail_percentage 8238 1726882397.96117: checking to see if all hosts have failed and the running result is not ok 8238 1726882397.96118: done checking to see if all hosts have failed 8238 1726882397.96118: getting the remaining hosts for this loop 8238 1726882397.96119: done getting the remaining hosts for this loop 8238 1726882397.96124: getting the next task for host managed_node3 8238 1726882397.96129: done getting next task for host managed_node3 8238 1726882397.96131: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8238 1726882397.96134: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882397.96136: getting variables 8238 1726882397.96137: in VariableManager get_vars() 8238 1726882397.96150: Calling all_inventory to load vars for managed_node3 8238 1726882397.96156: Calling groups_inventory to load vars for managed_node3 8238 1726882397.96158: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882397.96164: Calling all_plugins_play to load vars for managed_node3 8238 1726882397.96166: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882397.96169: Calling groups_plugins_play to load vars for managed_node3 8238 1726882397.97750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882397.99906: done with get_vars() 8238 1726882397.99942: done getting variables 8238 1726882397.99998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:18 -0400 (0:00:00.178) 0:00:28.155 ****** 8238 1726882398.00034: entering _queue_task() for managed_node3/set_fact 8238 1726882398.00418: worker is 1 (out of 1 available) 8238 1726882398.00634: exiting _queue_task() for managed_node3/set_fact 8238 1726882398.00646: done queuing things up, now waiting for results queue to drain 8238 1726882398.00648: waiting for pending results... 8238 1726882398.00764: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 8238 1726882398.00915: in run() - task 0affc7ec-ae25-54bc-d334-000000000440 8238 1726882398.00941: variable 'ansible_search_path' from source: unknown 8238 1726882398.00948: variable 'ansible_search_path' from source: unknown 8238 1726882398.00999: calling self._execute() 8238 1726882398.01110: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.01124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.01139: variable 'omit' from source: magic vars 8238 1726882398.01584: variable 'ansible_distribution_major_version' from source: facts 8238 1726882398.01603: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882398.01616: variable 'omit' from source: magic vars 8238 1726882398.01685: variable 'omit' from source: magic vars 8238 1726882398.01732: variable 'omit' from source: magic vars 8238 1726882398.01786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882398.01832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882398.01865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882398.01889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882398.01908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882398.01948: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 8238 1726882398.01963: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.01977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.02128: Set connection var ansible_connection to ssh 8238 1726882398.02131: Set connection var ansible_shell_type to sh 8238 1726882398.02134: Set connection var ansible_pipelining to False 8238 1726882398.02137: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882398.02139: Set connection var ansible_timeout to 10 8238 1726882398.02141: Set connection var ansible_shell_executable to /bin/sh 8238 1726882398.02174: variable 'ansible_shell_executable' from source: unknown 8238 1726882398.02296: variable 'ansible_connection' from source: unknown 8238 1726882398.02300: variable 'ansible_module_compression' from source: unknown 8238 1726882398.02302: variable 'ansible_shell_type' from source: unknown 8238 1726882398.02305: variable 'ansible_shell_executable' from source: unknown 8238 1726882398.02308: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.02310: variable 'ansible_pipelining' from source: unknown 8238 1726882398.02313: variable 'ansible_timeout' from source: unknown 8238 1726882398.02315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.02390: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882398.02413: variable 'omit' from source: magic vars 8238 1726882398.02425: starting attempt loop 8238 1726882398.02432: running the handler 8238 1726882398.02447: handler run complete 8238 1726882398.02466: attempt loop complete, returning result 8238 1726882398.02473: _execute() done 8238 1726882398.02480: dumping result to json 8238 1726882398.02487: done dumping result, returning 8238 1726882398.02498: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affc7ec-ae25-54bc-d334-000000000440] 8238 1726882398.02511: sending task result for task 0affc7ec-ae25-54bc-d334-000000000440 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8238 1726882398.02675: no more pending results, returning what we have 8238 1726882398.02679: results queue empty 8238 1726882398.02680: checking for any_errors_fatal 8238 1726882398.02682: done checking for any_errors_fatal 8238 1726882398.02683: checking for max_fail_percentage 8238 1726882398.02684: done checking for max_fail_percentage 8238 1726882398.02685: checking to see if all hosts have failed and the running result is not ok 8238 1726882398.02686: done checking to see if all hosts have failed 8238 1726882398.02687: getting the remaining hosts for this loop 8238 1726882398.02689: done getting the remaining hosts for this loop 8238 1726882398.02693: getting the next task for host managed_node3 8238 1726882398.02701: done getting next task for host managed_node3 8238 1726882398.02704: ^ task is: TASK: Stat profile file 8238 1726882398.02709: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882398.02714: getting variables 8238 1726882398.02716: in VariableManager get_vars() 8238 1726882398.02764: Calling all_inventory to load vars for managed_node3 8238 1726882398.02767: Calling groups_inventory to load vars for managed_node3 8238 1726882398.02770: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882398.02782: Calling all_plugins_play to load vars for managed_node3 8238 1726882398.02785: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882398.02788: Calling groups_plugins_play to load vars for managed_node3 8238 1726882398.03468: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000440 8238 1726882398.03471: WORKER PROCESS EXITING 8238 1726882398.04746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882398.07019: done with get_vars() 8238 1726882398.07046: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:18 -0400 (0:00:00.071) 0:00:28.226 ****** 8238 1726882398.07147: entering _queue_task() for managed_node3/stat 8238 1726882398.07483: worker is 1 (out of 1 available) 8238 1726882398.07498: exiting _queue_task() for managed_node3/stat 8238 1726882398.07511: done queuing things up, now waiting for results queue to drain 8238 1726882398.07513: waiting for pending results... 
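The two tasks just dispatched are the head of the re-included get_profile_stat.yml: a set_fact that resets the three lsr_net_profile_* flags (its result, with all three set to false, is shown above) and a stat task that registers profile_stat, the variable every later when: guard keys on. A sketch of that pair; the flag names and defaults are taken from the result above, while the ifcfg path in the stat task is an assumption:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path is an assumption
      register: profile_stat   # feeds the profile_stat.stat.exists guards seen earlier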
8238 1726882398.07800: running TaskExecutor() for managed_node3/TASK: Stat profile file 8238 1726882398.07931: in run() - task 0affc7ec-ae25-54bc-d334-000000000441 8238 1726882398.07959: variable 'ansible_search_path' from source: unknown 8238 1726882398.07966: variable 'ansible_search_path' from source: unknown 8238 1726882398.08009: calling self._execute() 8238 1726882398.08127: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.08139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.08157: variable 'omit' from source: magic vars 8238 1726882398.08575: variable 'ansible_distribution_major_version' from source: facts 8238 1726882398.08595: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882398.08607: variable 'omit' from source: magic vars 8238 1726882398.08669: variable 'omit' from source: magic vars 8238 1726882398.08784: variable 'profile' from source: include params 8238 1726882398.08818: variable 'item' from source: include params 8238 1726882398.08876: variable 'item' from source: include params 8238 1726882398.08901: variable 'omit' from source: magic vars 8238 1726882398.08956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882398.09128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882398.09131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882398.09134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882398.09137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882398.09140: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882398.09142: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.09144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.09225: Set connection var ansible_connection to ssh 8238 1726882398.09234: Set connection var ansible_shell_type to sh 8238 1726882398.09244: Set connection var ansible_pipelining to False 8238 1726882398.09260: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882398.09271: Set connection var ansible_timeout to 10 8238 1726882398.09284: Set connection var ansible_shell_executable to /bin/sh 8238 1726882398.09311: variable 'ansible_shell_executable' from source: unknown 8238 1726882398.09319: variable 'ansible_connection' from source: unknown 8238 1726882398.09329: variable 'ansible_module_compression' from source: unknown 8238 1726882398.09336: variable 'ansible_shell_type' from source: unknown 8238 1726882398.09342: variable 'ansible_shell_executable' from source: unknown 8238 1726882398.09349: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.09360: variable 'ansible_pipelining' from source: unknown 8238 1726882398.09371: variable 'ansible_timeout' from source: unknown 8238 1726882398.09379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.09601: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882398.09695: variable 'omit' from source: magic vars 8238 1726882398.09698: starting attempt loop 8238 1726882398.09701: running the handler 8238 1726882398.09703: _low_level_execute_command(): starting 8238 1726882398.09706: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882398.10448: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882398.10470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882398.10583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.10597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882398.10611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.10637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.10767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.12546: stdout chunk (state=3): >>>/root <<< 8238 1726882398.12735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.12747: stdout chunk (state=3): >>><<< 8238 1726882398.12768: stderr chunk (state=3): >>><<< 8238 1726882398.12796: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882398.12816: _low_level_execute_command(): starting 8238 
1726882398.12829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326 `" && echo ansible-tmp-1726882398.1280258-9323-174637682183326="` echo /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326 `" ) && sleep 0' 8238 1726882398.13501: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882398.13516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882398.13534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882398.13551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882398.13677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.13707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.13827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.23797: stdout chunk (state=3): >>>ansible-tmp-1726882398.1280258-9323-174637682183326=/root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326 <<< 8238 1726882398.24130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.24134: stdout chunk (state=3): >>><<< 8238 1726882398.24136: stderr chunk (state=3): >>><<< 8238 1726882398.24140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882398.1280258-9323-174637682183326=/root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882398.24142: variable 'ansible_module_compression' from source: unknown 8238 1726882398.24321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8238 1726882398.24329: variable 'ansible_facts' from source: unknown 8238 1726882398.24529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/AnsiballZ_stat.py 8238 1726882398.24860: Sending initial data 8238 1726882398.24864: Sent initial data (151 bytes) 8238 1726882398.25698: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882398.25703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882398.25766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.25771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882398.25860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.25940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.27603: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882398.27692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882398.27781: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpzbtksdvw /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/AnsiballZ_stat.py <<< 8238 1726882398.27796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/AnsiballZ_stat.py" <<< 8238 1726882398.27866: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 8238 1726882398.27889: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpzbtksdvw" to remote "/root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/AnsiballZ_stat.py" <<< 8238 1726882398.28999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.29003: stderr chunk (state=3): >>><<< 8238 1726882398.29006: stdout chunk (state=3): >>><<< 8238 1726882398.29008: done transferring module to remote 8238 1726882398.29010: _low_level_execute_command(): starting 8238 1726882398.29012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/ /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/AnsiballZ_stat.py && sleep 0' 8238 1726882398.29740: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.29805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882398.29824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.29849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.29985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.31930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.31948: stdout chunk (state=3): >>><<< 8238 1726882398.31961: stderr chunk (state=3): >>><<< 8238 1726882398.32070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882398.32073: _low_level_execute_command(): starting 8238 1726882398.32076: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/AnsiballZ_stat.py && sleep 0' 8238 1726882398.32680: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882398.32696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882398.32711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882398.32732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882398.32783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.32861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882398.32891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.32915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.33034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.49603: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8238 1726882398.51129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882398.51135: stdout chunk (state=3): >>><<< 8238 1726882398.51137: stderr chunk (state=3): >>><<< 8238 1726882398.51140: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882398.51142: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882398.51145: _low_level_execute_command(): starting 8238 1726882398.51147: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882398.1280258-9323-174637682183326/ > /dev/null 2>&1 && sleep 0' 8238 1726882398.51595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882398.51599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882398.51632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.51635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 8238 1726882398.51638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882398.51677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.51719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.51728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.51815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.53734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.53772: stderr chunk (state=3): >>><<< 8238 1726882398.53775: stdout chunk (state=3): >>><<< 8238 1726882398.53788: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882398.53794: handler run complete 8238 1726882398.53811: attempt loop complete, returning result 8238 1726882398.53814: _execute() done 8238 1726882398.53817: dumping result to json 8238 1726882398.53823: done dumping result, returning 8238 1726882398.53831: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affc7ec-ae25-54bc-d334-000000000441] 8238 1726882398.53836: sending task result for task 0affc7ec-ae25-54bc-d334-000000000441 8238 1726882398.53953: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000441 8238 1726882398.53956: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8238 1726882398.54285: no more pending results, returning what we have 8238 1726882398.54288: results queue empty 8238 1726882398.54289: checking for any_errors_fatal 8238 1726882398.54295: done checking for any_errors_fatal 8238 1726882398.54296: checking for max_fail_percentage 8238 1726882398.54297: done checking for max_fail_percentage 8238 1726882398.54298: checking to see if all hosts have failed and the running result is not ok 8238 1726882398.54299: done checking to see if all hosts have failed 8238 1726882398.54300: getting the remaining hosts for this loop 8238 1726882398.54302: done 
getting the remaining hosts for this loop 8238 1726882398.54305: getting the next task for host managed_node3 8238 1726882398.54311: done getting next task for host managed_node3 8238 1726882398.54314: ^ task is: TASK: Set NM profile exist flag based on the profile files 8238 1726882398.54318: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8238 1726882398.54324: getting variables 8238 1726882398.54326: in VariableManager get_vars() 8238 1726882398.54374: Calling all_inventory to load vars for managed_node3 8238 1726882398.54378: Calling groups_inventory to load vars for managed_node3 8238 1726882398.54380: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882398.54391: Calling all_plugins_play to load vars for managed_node3 8238 1726882398.54394: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882398.54398: Calling groups_plugins_play to load vars for managed_node3 8238 1726882398.55642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882398.56794: done with get_vars() 8238 1726882398.56814: done getting variables 8238 1726882398.56868: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:18 -0400 (0:00:00.497) 0:00:28.723 ****** 8238 1726882398.56891: entering _queue_task() for managed_node3/set_fact 8238 1726882398.57242: worker is 1 (out of 1 available) 8238 1726882398.57254: exiting _queue_task() for managed_node3/set_fact 8238 1726882398.57268: done queuing things up, now waiting for results queue to drain 8238 1726882398.57270: waiting for pending results... 
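The next paragraph shows this set_fact task being skipped because profile_stat.stat.exists evaluates to False. As a hedged sketch of what such a task typically looks like, with only the task name, module, and conditional taken from the log and the fact name assumed:

# Hypothetical sketch of get_profile_stat.yml:17; lsr_net_profile_exists is
# an assumed fact name, not confirmed anywhere in this log.
- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists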
8238 1726882398.57653: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 8238 1726882398.57730: in run() - task 0affc7ec-ae25-54bc-d334-000000000442 8238 1726882398.57734: variable 'ansible_search_path' from source: unknown 8238 1726882398.57737: variable 'ansible_search_path' from source: unknown 8238 1726882398.57753: calling self._execute() 8238 1726882398.57859: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.57872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.57886: variable 'omit' from source: magic vars 8238 1726882398.58291: variable 'ansible_distribution_major_version' from source: facts 8238 1726882398.58309: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882398.58445: variable 'profile_stat' from source: set_fact 8238 1726882398.58509: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882398.58513: when evaluation is False, skipping this task 8238 1726882398.58516: _execute() done 8238 1726882398.58518: dumping result to json 8238 1726882398.58520: done dumping result, returning 8238 1726882398.58525: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affc7ec-ae25-54bc-d334-000000000442] 8238 1726882398.58528: sending task result for task 0affc7ec-ae25-54bc-d334-000000000442 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882398.58665: no more pending results, returning what we have 8238 1726882398.58670: results queue empty 8238 1726882398.58671: checking for any_errors_fatal 8238 1726882398.58680: done checking for any_errors_fatal 8238 1726882398.58681: checking for max_fail_percentage 8238 1726882398.58682: done checking for max_fail_percentage 8238 1726882398.58683: checking to see if all hosts have failed and the running result is not ok 8238 1726882398.58684: done checking to see if all hosts have failed 8238 1726882398.58685: getting the remaining hosts for this loop 8238 1726882398.58687: done getting the remaining hosts for this loop 8238 1726882398.58691: getting the next task for host managed_node3 8238 1726882398.58699: done getting next task for host managed_node3 8238 1726882398.58702: ^ task is: TASK: Get NM profile info 8238 1726882398.58707: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882398.58712: getting variables 8238 1726882398.58714: in VariableManager get_vars() 8238 1726882398.58759: Calling all_inventory to load vars for managed_node3 8238 1726882398.58763: Calling groups_inventory to load vars for managed_node3 8238 1726882398.58765: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882398.58781: Calling all_plugins_play to load vars for managed_node3 8238 1726882398.58784: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882398.58788: Calling groups_plugins_play to load vars for managed_node3 8238 1726882398.59605: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000442 8238 1726882398.59610: WORKER PROCESS EXITING 8238 1726882398.60835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882398.62880: done with get_vars() 8238 1726882398.62909: done getting variables 8238 1726882398.62979: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:18 -0400 (0:00:00.061) 0:00:28.785 ****** 8238 1726882398.63012: entering _queue_task() for managed_node3/shell 8238 1726882398.63344: worker is 1 (out of 1 available) 8238 1726882398.63359: exiting _queue_task() for managed_node3/shell 8238 1726882398.63373: done queuing things up, now waiting for results queue to drain 8238 1726882398.63375: waiting for pending results... 
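The shell command itself is recorded verbatim in the result below (_raw_params). A minimal sketch of the corresponding task follows, with bond0.1 parameterized as {{ profile }}; the register name is an assumption, and changed_when: false is only inferred from the "Evaluated conditional (False): False" entry and the changed: false result in the task output.

# Hypothetical sketch of get_profile_stat.yml:25; the command mirrors the
# recorded _raw_params, everything else is assumed.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists  # assumed register name
  changed_when: false          # inferred from the changed:false result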
8238 1726882398.63672: running TaskExecutor() for managed_node3/TASK: Get NM profile info 8238 1726882398.63810: in run() - task 0affc7ec-ae25-54bc-d334-000000000443 8238 1726882398.63836: variable 'ansible_search_path' from source: unknown 8238 1726882398.63849: variable 'ansible_search_path' from source: unknown 8238 1726882398.63895: calling self._execute() 8238 1726882398.64227: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.64232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.64235: variable 'omit' from source: magic vars 8238 1726882398.64423: variable 'ansible_distribution_major_version' from source: facts 8238 1726882398.64443: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882398.64461: variable 'omit' from source: magic vars 8238 1726882398.64521: variable 'omit' from source: magic vars 8238 1726882398.64640: variable 'profile' from source: include params 8238 1726882398.64650: variable 'item' from source: include params 8238 1726882398.64719: variable 'item' from source: include params 8238 1726882398.64746: variable 'omit' from source: magic vars 8238 1726882398.64797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882398.64844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882398.64870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882398.64899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882398.64915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882398.64953: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882398.64962: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.64970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.65083: Set connection var ansible_connection to ssh 8238 1726882398.65091: Set connection var ansible_shell_type to sh 8238 1726882398.65100: Set connection var ansible_pipelining to False 8238 1726882398.65114: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882398.65126: Set connection var ansible_timeout to 10 8238 1726882398.65138: Set connection var ansible_shell_executable to /bin/sh 8238 1726882398.65165: variable 'ansible_shell_executable' from source: unknown 8238 1726882398.65173: variable 'ansible_connection' from source: unknown 8238 1726882398.65181: variable 'ansible_module_compression' from source: unknown 8238 1726882398.65188: variable 'ansible_shell_type' from source: unknown 8238 1726882398.65194: variable 'ansible_shell_executable' from source: unknown 8238 1726882398.65200: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882398.65209: variable 'ansible_pipelining' from source: unknown 8238 1726882398.65327: variable 'ansible_timeout' from source: unknown 8238 1726882398.65333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882398.65398: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882398.65416: variable 'omit' from source: magic vars 8238 1726882398.65429: starting attempt loop 8238 1726882398.65441: running the handler 8238 1726882398.65455: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882398.65479: _low_level_execute_command(): starting 8238 1726882398.65491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882398.66326: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.66366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882398.66384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.66411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.66536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.68325: stdout chunk (state=3): >>>/root <<< 8238 1726882398.68479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.68486: stderr chunk (state=3): >>><<< 8238 1726882398.68489: stdout chunk (state=3): >>><<< 8238 1726882398.68728: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882398.68733: _low_level_execute_command(): starting 8238 1726882398.68736: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607 `" && echo ansible-tmp-1726882398.6852992-9350-114081379370607="` echo /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607 `" ) && sleep 0' 8238 1726882398.69260: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882398.69270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882398.69281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882398.69295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882398.69309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882398.69324: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882398.69336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.69351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882398.69363: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882398.69371: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882398.69403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.69513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.69537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.69658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.71829: stdout chunk (state=3): >>>ansible-tmp-1726882398.6852992-9350-114081379370607=/root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607 <<< 8238 1726882398.71833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.71852: stdout chunk (state=3): >>><<< 8238 1726882398.71855: stderr chunk (state=3): >>><<< 8238 1726882398.71863: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882398.6852992-9350-114081379370607=/root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882398.71899: variable 'ansible_module_compression' from source: unknown 8238 1726882398.71951: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882398.71996: variable 'ansible_facts' from source: unknown 8238 1726882398.72279: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/AnsiballZ_command.py 8238 1726882398.72648: Sending initial data 8238 1726882398.72658: Sent initial data (154 bytes) 8238 1726882398.73190: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882398.73237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.73301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882398.73318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.73351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.73440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.75245: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882398.75250: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882398.75340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp_p1qwwnm /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/AnsiballZ_command.py <<< 8238 1726882398.75344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/AnsiballZ_command.py" <<< 8238 1726882398.75585: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp_p1qwwnm" to remote "/root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/AnsiballZ_command.py" <<< 8238 1726882398.76630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.76727: stderr chunk (state=3): >>><<< 8238 1726882398.76745: stdout chunk (state=3): >>><<< 8238 1726882398.76775: done transferring module to remote 8238 1726882398.76790: _low_level_execute_command(): starting 8238 1726882398.76798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/ /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/AnsiballZ_command.py && sleep 0' 8238 1726882398.77418: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882398.77433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882398.77448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882398.77481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882398.77497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882398.77507: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882398.77520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882398.77546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882398.77561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882398.77577: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882398.77673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882398.77698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882398.77712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.77850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882398.79770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882398.79825: stderr chunk (state=3): >>><<< 8238 1726882398.79835: stdout chunk (state=3): >>><<< 
8238 1726882398.79859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882398.80062: _low_level_execute_command(): starting 8238 1726882398.80066: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/AnsiballZ_command.py && sleep 0' 8238 1726882398.81236: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882398.81440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882398.81561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882399.00617: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:18.981773", "end": "2024-09-20 21:33:19.004535", "delta": "0:00:00.022762", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882399.02289: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882399.02473: stdout chunk (state=3): >>><<< 8238 1726882399.02476: stderr chunk (state=3): >>><<< 8238 1726882399.02479: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:18.981773", "end": "2024-09-20 21:33:19.004535", "delta": "0:00:00.022762", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
8238 1726882399.02483: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882399.02490: _low_level_execute_command(): starting 8238 1726882399.02492: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882398.6852992-9350-114081379370607/ > /dev/null 2>&1 && sleep 0' 8238 1726882399.03120: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882399.03145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882399.03165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882399.03187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882399.03255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882399.03313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882399.03355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882399.03469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882399.05527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882399.05531: stderr chunk (state=3): >>><<< 8238 1726882399.05533: stdout chunk (state=3): >>><<< 8238 1726882399.05536: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882399.05539: handler run complete 8238 1726882399.05541: Evaluated conditional (False): False 8238 1726882399.05543: attempt loop complete, returning result 8238 1726882399.05546: _execute() done 8238 1726882399.05548: dumping result to json 8238 1726882399.05566: done dumping result, returning 8238 1726882399.05574: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affc7ec-ae25-54bc-d334-000000000443] 8238 1726882399.05585: sending task result for task 0affc7ec-ae25-54bc-d334-000000000443 ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022762", "end": "2024-09-20 21:33:19.004535", "rc": 0, "start": "2024-09-20 21:33:18.981773" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 8238 1726882399.05778: no more pending results, returning what we have 8238 1726882399.05782: results queue empty 8238 1726882399.05783: checking for any_errors_fatal 8238 1726882399.05788: done checking for any_errors_fatal 8238 1726882399.05789: checking for max_fail_percentage 8238 1726882399.05791: done checking for max_fail_percentage 8238 1726882399.05792: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.05794: done checking to see if all hosts have failed 8238 1726882399.05795: getting the remaining hosts for this loop 8238 1726882399.05796: done getting the remaining hosts for this loop 8238 1726882399.05801: getting the next task for host managed_node3 8238 1726882399.05809: done getting next task for host managed_node3 8238 1726882399.05811: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8238 1726882399.05815: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.05820: getting variables 8238 1726882399.05821: in VariableManager get_vars() 8238 1726882399.05867: Calling all_inventory to load vars for managed_node3 8238 1726882399.05870: Calling groups_inventory to load vars for managed_node3 8238 1726882399.05873: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.05885: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.05888: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.05891: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.06585: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000443 8238 1726882399.06589: WORKER PROCESS EXITING 8238 1726882399.07882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.09127: done with get_vars() 8238 1726882399.09145: done getting variables 8238 1726882399.09195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:19 -0400 (0:00:00.462) 0:00:29.247 ****** 8238 1726882399.09220: entering _queue_task() for managed_node3/set_fact 8238 1726882399.09500: worker is 1 (out of 1 available) 8238 1726882399.09513: exiting _queue_task() for managed_node3/set_fact 8238 1726882399.09532: done queuing things up, now waiting for results queue to drain 8238 1726882399.09534: waiting for pending results... 
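The set_fact task queued above (get_profile_stat.yml:35) is the step that records the profile flags the later assert tasks consume. Judging from the conditionals the worker evaluates and the ansible_facts it returns a few lines below, it amounts to roughly the following; the exact YAML in the role's test file may differ.

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0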
8238 1726882399.09775: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8238 1726882399.09892: in run() - task 0affc7ec-ae25-54bc-d334-000000000444 8238 1726882399.09908: variable 'ansible_search_path' from source: unknown 8238 1726882399.09912: variable 'ansible_search_path' from source: unknown 8238 1726882399.09958: calling self._execute() 8238 1726882399.10061: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.10067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.10077: variable 'omit' from source: magic vars 8238 1726882399.10479: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.10490: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.10632: variable 'nm_profile_exists' from source: set_fact 8238 1726882399.10644: Evaluated conditional (nm_profile_exists.rc == 0): True 8238 1726882399.10648: variable 'omit' from source: magic vars 8238 1726882399.10815: variable 'omit' from source: magic vars 8238 1726882399.10818: variable 'omit' from source: magic vars 8238 1726882399.10821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882399.10830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882399.10887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882399.10912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.10931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.10988: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882399.10997: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.11005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.11131: Set connection var ansible_connection to ssh 8238 1726882399.11160: Set connection var ansible_shell_type to sh 8238 1726882399.11168: Set connection var ansible_pipelining to False 8238 1726882399.11171: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882399.11267: Set connection var ansible_timeout to 10 8238 1726882399.11270: Set connection var ansible_shell_executable to /bin/sh 8238 1726882399.11274: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.11277: variable 'ansible_connection' from source: unknown 8238 1726882399.11279: variable 'ansible_module_compression' from source: unknown 8238 1726882399.11281: variable 'ansible_shell_type' from source: unknown 8238 1726882399.11285: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.11287: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.11289: variable 'ansible_pipelining' from source: unknown 8238 1726882399.11291: variable 'ansible_timeout' from source: unknown 8238 1726882399.11293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.11474: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882399.11488: variable 'omit' from source: magic vars 8238 1726882399.11493: starting attempt loop 8238 1726882399.11496: running the handler 8238 1726882399.11519: handler run complete 8238 1726882399.11531: attempt loop complete, returning result 8238 1726882399.11534: _execute() done 8238 1726882399.11537: dumping result to json 8238 1726882399.11542: done dumping result, returning 8238 1726882399.11557: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affc7ec-ae25-54bc-d334-000000000444] 8238 1726882399.11560: sending task result for task 0affc7ec-ae25-54bc-d334-000000000444 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8238 1726882399.11711: no more pending results, returning what we have 8238 1726882399.11715: results queue empty 8238 1726882399.11716: checking for any_errors_fatal 8238 1726882399.11725: done checking for any_errors_fatal 8238 1726882399.11726: checking for max_fail_percentage 8238 1726882399.11728: done checking for max_fail_percentage 8238 1726882399.11729: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.11730: done checking to see if all hosts have failed 8238 1726882399.11730: getting the remaining hosts for this loop 8238 1726882399.11732: done getting the remaining hosts for this loop 8238 1726882399.11756: getting the next task for host managed_node3 8238 1726882399.11767: done getting next task for host managed_node3 8238 1726882399.11769: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8238 1726882399.11773: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.11776: getting variables 8238 1726882399.11778: in VariableManager get_vars() 8238 1726882399.11813: Calling all_inventory to load vars for managed_node3 8238 1726882399.11816: Calling groups_inventory to load vars for managed_node3 8238 1726882399.11818: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.11827: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000444 8238 1726882399.11829: WORKER PROCESS EXITING 8238 1726882399.11839: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.11846: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.11849: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.12810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.14465: done with get_vars() 8238 1726882399.14486: done getting variables 8238 1726882399.14536: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882399.14630: variable 'profile' from source: include params 8238 1726882399.14633: variable 'item' from source: include params 8238 1726882399.14683: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:19 -0400 (0:00:00.054) 0:00:29.302 ****** 8238 1726882399.14711: entering _queue_task() for managed_node3/command 8238 1726882399.14974: worker is 1 (out of 1 available) 8238 1726882399.14988: exiting _queue_task() for managed_node3/command 8238 1726882399.15001: done queuing things up, now waiting for results queue to drain 8238 1726882399.15002: waiting for pending results... 
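The task about to run, "Get the ansible_managed comment in ifcfg-bond0.1" (get_profile_stat.yml:49), is skipped: its when clause tests profile_stat.stat.exists, which is false here, consistent with the earlier nmcli output showing the profile stored as a keyfile under /etc/NetworkManager/system-connections rather than as an ifcfg file. Structurally it presumably looks like the sketch below; the command body never executes in this run and is therefore left as a placeholder rather than guessed.

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: ...   # real command not visible in this log; the task is skipped before execution
      when: profile_stat.stat.exists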
8238 1726882399.15186: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 8238 1726882399.15278: in run() - task 0affc7ec-ae25-54bc-d334-000000000446 8238 1726882399.15288: variable 'ansible_search_path' from source: unknown 8238 1726882399.15292: variable 'ansible_search_path' from source: unknown 8238 1726882399.15326: calling self._execute() 8238 1726882399.15405: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.15409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.15418: variable 'omit' from source: magic vars 8238 1726882399.15714: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.15729: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.15817: variable 'profile_stat' from source: set_fact 8238 1726882399.15831: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882399.15835: when evaluation is False, skipping this task 8238 1726882399.15837: _execute() done 8238 1726882399.15840: dumping result to json 8238 1726882399.15844: done dumping result, returning 8238 1726882399.15850: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affc7ec-ae25-54bc-d334-000000000446] 8238 1726882399.15858: sending task result for task 0affc7ec-ae25-54bc-d334-000000000446 8238 1726882399.15951: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000446 8238 1726882399.15954: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882399.16038: no more pending results, returning what we have 8238 1726882399.16042: results queue empty 8238 1726882399.16043: checking for any_errors_fatal 8238 1726882399.16047: done checking for any_errors_fatal 8238 1726882399.16048: checking for max_fail_percentage 8238 1726882399.16049: done checking for max_fail_percentage 8238 1726882399.16050: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.16051: done checking to see if all hosts have failed 8238 1726882399.16052: getting the remaining hosts for this loop 8238 1726882399.16053: done getting the remaining hosts for this loop 8238 1726882399.16056: getting the next task for host managed_node3 8238 1726882399.16065: done getting next task for host managed_node3 8238 1726882399.16067: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8238 1726882399.16071: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.16075: getting variables 8238 1726882399.16076: in VariableManager get_vars() 8238 1726882399.16109: Calling all_inventory to load vars for managed_node3 8238 1726882399.16111: Calling groups_inventory to load vars for managed_node3 8238 1726882399.16113: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.16126: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.16128: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.16131: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.17186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.18338: done with get_vars() 8238 1726882399.18355: done getting variables 8238 1726882399.18399: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882399.18479: variable 'profile' from source: include params 8238 1726882399.18481: variable 'item' from source: include params 8238 1726882399.18520: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:19 -0400 (0:00:00.038) 0:00:29.340 ****** 8238 1726882399.18545: entering _queue_task() for managed_node3/set_fact 8238 1726882399.18766: worker is 1 (out of 1 available) 8238 1726882399.18782: exiting _queue_task() for managed_node3/set_fact 8238 1726882399.18799: done queuing things up, now waiting for results queue to drain 8238 1726882399.18801: waiting for pending results... 
8238 1726882399.19144: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 8238 1726882399.19149: in run() - task 0affc7ec-ae25-54bc-d334-000000000447 8238 1726882399.19158: variable 'ansible_search_path' from source: unknown 8238 1726882399.19167: variable 'ansible_search_path' from source: unknown 8238 1726882399.19216: calling self._execute() 8238 1726882399.19327: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.19340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.19358: variable 'omit' from source: magic vars 8238 1726882399.19741: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.19761: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.19900: variable 'profile_stat' from source: set_fact 8238 1726882399.19927: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882399.19941: when evaluation is False, skipping this task 8238 1726882399.19945: _execute() done 8238 1726882399.19948: dumping result to json 8238 1726882399.19950: done dumping result, returning 8238 1726882399.19958: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affc7ec-ae25-54bc-d334-000000000447] 8238 1726882399.19963: sending task result for task 0affc7ec-ae25-54bc-d334-000000000447 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882399.20112: no more pending results, returning what we have 8238 1726882399.20115: results queue empty 8238 1726882399.20116: checking for any_errors_fatal 8238 1726882399.20123: done checking for any_errors_fatal 8238 1726882399.20124: checking for max_fail_percentage 8238 1726882399.20126: done checking for max_fail_percentage 8238 1726882399.20126: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.20128: done checking to see if all hosts have failed 8238 1726882399.20128: getting the remaining hosts for this loop 8238 1726882399.20130: done getting the remaining hosts for this loop 8238 1726882399.20133: getting the next task for host managed_node3 8238 1726882399.20139: done getting next task for host managed_node3 8238 1726882399.20142: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8238 1726882399.20145: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.20149: getting variables 8238 1726882399.20150: in VariableManager get_vars() 8238 1726882399.20186: Calling all_inventory to load vars for managed_node3 8238 1726882399.20188: Calling groups_inventory to load vars for managed_node3 8238 1726882399.20190: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.20200: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.20203: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.20206: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.21114: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000447 8238 1726882399.21118: WORKER PROCESS EXITING 8238 1726882399.21137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.22717: done with get_vars() 8238 1726882399.22742: done getting variables 8238 1726882399.22804: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882399.22911: variable 'profile' from source: include params 8238 1726882399.22915: variable 'item' from source: include params 8238 1726882399.22975: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:19 -0400 (0:00:00.044) 0:00:29.385 ****** 8238 1726882399.23005: entering _queue_task() for managed_node3/command 8238 1726882399.23285: worker is 1 (out of 1 available) 8238 1726882399.23299: exiting _queue_task() for managed_node3/command 8238 1726882399.23313: done queuing things up, now waiting for results queue to drain 8238 1726882399.23314: waiting for pending results... 
8238 1726882399.23583: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 8238 1726882399.23733: in run() - task 0affc7ec-ae25-54bc-d334-000000000448 8238 1726882399.23788: variable 'ansible_search_path' from source: unknown 8238 1726882399.23791: variable 'ansible_search_path' from source: unknown 8238 1726882399.23816: calling self._execute() 8238 1726882399.23928: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.23942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.23961: variable 'omit' from source: magic vars 8238 1726882399.24440: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.24444: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.24533: variable 'profile_stat' from source: set_fact 8238 1726882399.24560: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882399.24569: when evaluation is False, skipping this task 8238 1726882399.24576: _execute() done 8238 1726882399.24585: dumping result to json 8238 1726882399.24593: done dumping result, returning 8238 1726882399.24604: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affc7ec-ae25-54bc-d334-000000000448] 8238 1726882399.24616: sending task result for task 0affc7ec-ae25-54bc-d334-000000000448 8238 1726882399.24829: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000448 8238 1726882399.24833: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882399.24882: no more pending results, returning what we have 8238 1726882399.24885: results queue empty 8238 1726882399.24886: checking for any_errors_fatal 8238 1726882399.24891: done checking for any_errors_fatal 8238 1726882399.24892: checking for max_fail_percentage 8238 1726882399.24893: done checking for max_fail_percentage 8238 1726882399.24894: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.24895: done checking to see if all hosts have failed 8238 1726882399.24896: getting the remaining hosts for this loop 8238 1726882399.24897: done getting the remaining hosts for this loop 8238 1726882399.24900: getting the next task for host managed_node3 8238 1726882399.24906: done getting next task for host managed_node3 8238 1726882399.24908: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8238 1726882399.24912: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.24916: getting variables 8238 1726882399.24917: in VariableManager get_vars() 8238 1726882399.24955: Calling all_inventory to load vars for managed_node3 8238 1726882399.24958: Calling groups_inventory to load vars for managed_node3 8238 1726882399.24961: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.24972: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.24975: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.24978: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.26416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.28940: done with get_vars() 8238 1726882399.28978: done getting variables 8238 1726882399.29062: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882399.29227: variable 'profile' from source: include params 8238 1726882399.29232: variable 'item' from source: include params 8238 1726882399.29305: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:19 -0400 (0:00:00.063) 0:00:29.448 ****** 8238 1726882399.29341: entering _queue_task() for managed_node3/set_fact 8238 1726882399.29702: worker is 1 (out of 1 available) 8238 1726882399.29716: exiting _queue_task() for managed_node3/set_fact 8238 1726882399.29732: done queuing things up, now waiting for results queue to drain 8238 1726882399.29741: waiting for pending results... 
8238 1726882399.30242: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 8238 1726882399.30250: in run() - task 0affc7ec-ae25-54bc-d334-000000000449 8238 1726882399.30277: variable 'ansible_search_path' from source: unknown 8238 1726882399.30292: variable 'ansible_search_path' from source: unknown 8238 1726882399.30343: calling self._execute() 8238 1726882399.30466: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.30481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.30500: variable 'omit' from source: magic vars 8238 1726882399.30932: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.30956: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.31095: variable 'profile_stat' from source: set_fact 8238 1726882399.31114: Evaluated conditional (profile_stat.stat.exists): False 8238 1726882399.31125: when evaluation is False, skipping this task 8238 1726882399.31133: _execute() done 8238 1726882399.31149: dumping result to json 8238 1726882399.31155: done dumping result, returning 8238 1726882399.31158: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affc7ec-ae25-54bc-d334-000000000449] 8238 1726882399.31188: sending task result for task 0affc7ec-ae25-54bc-d334-000000000449 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8238 1726882399.31489: no more pending results, returning what we have 8238 1726882399.31492: results queue empty 8238 1726882399.31493: checking for any_errors_fatal 8238 1726882399.31498: done checking for any_errors_fatal 8238 1726882399.31499: checking for max_fail_percentage 8238 1726882399.31501: done checking for max_fail_percentage 8238 1726882399.31501: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.31502: done checking to see if all hosts have failed 8238 1726882399.31503: getting the remaining hosts for this loop 8238 1726882399.31504: done getting the remaining hosts for this loop 8238 1726882399.31510: getting the next task for host managed_node3 8238 1726882399.31518: done getting next task for host managed_node3 8238 1726882399.31522: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8238 1726882399.31525: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.31532: getting variables 8238 1726882399.31533: in VariableManager get_vars() 8238 1726882399.31570: Calling all_inventory to load vars for managed_node3 8238 1726882399.31572: Calling groups_inventory to load vars for managed_node3 8238 1726882399.31575: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.31587: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.31590: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.31593: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.32316: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000449 8238 1726882399.32320: WORKER PROCESS EXITING 8238 1726882399.34132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.36930: done with get_vars() 8238 1726882399.37204: done getting variables 8238 1726882399.37457: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882399.37700: variable 'profile' from source: include params 8238 1726882399.37704: variable 'item' from source: include params 8238 1726882399.37994: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:19 -0400 (0:00:00.086) 0:00:29.535 ****** 8238 1726882399.38031: entering _queue_task() for managed_node3/assert 8238 1726882399.38893: worker is 1 (out of 1 available) 8238 1726882399.38905: exiting _queue_task() for managed_node3/assert 8238 1726882399.38919: done queuing things up, now waiting for results queue to drain 8238 1726882399.39038: waiting for pending results... 
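The assert queued above (assert_profile_present.yml:5) and the two that follow it simply test the flags recorded by the earlier set_fact step; each evaluates a single boolean fact (lsr_net_profile_exists, then lsr_net_profile_ansible_managed, then lsr_net_profile_fingerprint, per the conditionals logged below). A minimal sketch of the first of the three, inferred from those evaluations rather than copied from the test file:

    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists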
8238 1726882399.39641: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 8238 1726882399.39646: in run() - task 0affc7ec-ae25-54bc-d334-00000000026e 8238 1726882399.39650: variable 'ansible_search_path' from source: unknown 8238 1726882399.39657: variable 'ansible_search_path' from source: unknown 8238 1726882399.40029: calling self._execute() 8238 1726882399.40033: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.40037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.40041: variable 'omit' from source: magic vars 8238 1726882399.40780: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.41227: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.41231: variable 'omit' from source: magic vars 8238 1726882399.41233: variable 'omit' from source: magic vars 8238 1726882399.41237: variable 'profile' from source: include params 8238 1726882399.41240: variable 'item' from source: include params 8238 1726882399.41294: variable 'item' from source: include params 8238 1726882399.41728: variable 'omit' from source: magic vars 8238 1726882399.41732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882399.41736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882399.41738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882399.41741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.41743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.41746: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882399.41748: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.41750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.42037: Set connection var ansible_connection to ssh 8238 1726882399.42045: Set connection var ansible_shell_type to sh 8238 1726882399.42059: Set connection var ansible_pipelining to False 8238 1726882399.42069: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882399.42079: Set connection var ansible_timeout to 10 8238 1726882399.42092: Set connection var ansible_shell_executable to /bin/sh 8238 1726882399.42120: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.42428: variable 'ansible_connection' from source: unknown 8238 1726882399.42431: variable 'ansible_module_compression' from source: unknown 8238 1726882399.42434: variable 'ansible_shell_type' from source: unknown 8238 1726882399.42440: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.42443: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.42445: variable 'ansible_pipelining' from source: unknown 8238 1726882399.42448: variable 'ansible_timeout' from source: unknown 8238 1726882399.42450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.42539: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882399.42559: variable 'omit' from source: magic vars 8238 1726882399.42644: starting attempt loop 8238 1726882399.42651: running the handler 8238 1726882399.42879: variable 'lsr_net_profile_exists' from source: set_fact 8238 1726882399.42890: Evaluated conditional (lsr_net_profile_exists): True 8238 1726882399.42900: handler run complete 8238 1726882399.42919: attempt loop complete, returning result 8238 1726882399.43330: _execute() done 8238 1726882399.43334: dumping result to json 8238 1726882399.43337: done dumping result, returning 8238 1726882399.43339: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [0affc7ec-ae25-54bc-d334-00000000026e] 8238 1726882399.43341: sending task result for task 0affc7ec-ae25-54bc-d334-00000000026e ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882399.43467: no more pending results, returning what we have 8238 1726882399.43472: results queue empty 8238 1726882399.43473: checking for any_errors_fatal 8238 1726882399.43481: done checking for any_errors_fatal 8238 1726882399.43482: checking for max_fail_percentage 8238 1726882399.43484: done checking for max_fail_percentage 8238 1726882399.43485: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.43486: done checking to see if all hosts have failed 8238 1726882399.43487: getting the remaining hosts for this loop 8238 1726882399.43489: done getting the remaining hosts for this loop 8238 1726882399.43493: getting the next task for host managed_node3 8238 1726882399.43502: done getting next task for host managed_node3 8238 1726882399.43509: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8238 1726882399.43513: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.43518: getting variables 8238 1726882399.43520: in VariableManager get_vars() 8238 1726882399.43567: Calling all_inventory to load vars for managed_node3 8238 1726882399.43571: Calling groups_inventory to load vars for managed_node3 8238 1726882399.43573: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.43588: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.43591: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.43594: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.44385: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000026e 8238 1726882399.44929: WORKER PROCESS EXITING 8238 1726882399.47066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.51195: done with get_vars() 8238 1726882399.51231: done getting variables 8238 1726882399.51304: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882399.51455: variable 'profile' from source: include params 8238 1726882399.51460: variable 'item' from source: include params 8238 1726882399.51528: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:19 -0400 (0:00:00.135) 0:00:29.670 ****** 8238 1726882399.51577: entering _queue_task() for managed_node3/assert 8238 1726882399.52099: worker is 1 (out of 1 available) 8238 1726882399.52112: exiting _queue_task() for managed_node3/assert 8238 1726882399.52127: done queuing things up, now waiting for results queue to drain 8238 1726882399.52129: waiting for pending results... 
8238 1726882399.52365: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 8238 1726882399.52467: in run() - task 0affc7ec-ae25-54bc-d334-00000000026f 8238 1726882399.52489: variable 'ansible_search_path' from source: unknown 8238 1726882399.52493: variable 'ansible_search_path' from source: unknown 8238 1726882399.52559: calling self._execute() 8238 1726882399.52682: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.52690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.52702: variable 'omit' from source: magic vars 8238 1726882399.53185: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.53209: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.53215: variable 'omit' from source: magic vars 8238 1726882399.53260: variable 'omit' from source: magic vars 8238 1726882399.53383: variable 'profile' from source: include params 8238 1726882399.53394: variable 'item' from source: include params 8238 1726882399.53469: variable 'item' from source: include params 8238 1726882399.53498: variable 'omit' from source: magic vars 8238 1726882399.53556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882399.53602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882399.53620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882399.53652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.53664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.53699: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882399.53703: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.53706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.54028: Set connection var ansible_connection to ssh 8238 1726882399.54032: Set connection var ansible_shell_type to sh 8238 1726882399.54034: Set connection var ansible_pipelining to False 8238 1726882399.54037: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882399.54039: Set connection var ansible_timeout to 10 8238 1726882399.54042: Set connection var ansible_shell_executable to /bin/sh 8238 1726882399.54044: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.54046: variable 'ansible_connection' from source: unknown 8238 1726882399.54048: variable 'ansible_module_compression' from source: unknown 8238 1726882399.54051: variable 'ansible_shell_type' from source: unknown 8238 1726882399.54055: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.54058: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.54060: variable 'ansible_pipelining' from source: unknown 8238 1726882399.54062: variable 'ansible_timeout' from source: unknown 8238 1726882399.54065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.54216: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882399.54220: variable 'omit' from source: magic vars 8238 1726882399.54227: starting attempt loop 8238 1726882399.54230: running the handler 8238 1726882399.54232: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8238 1726882399.54235: Evaluated conditional (lsr_net_profile_ansible_managed): True 8238 1726882399.54239: handler run complete 8238 1726882399.54258: attempt loop complete, returning result 8238 1726882399.54261: _execute() done 8238 1726882399.54264: dumping result to json 8238 1726882399.54268: done dumping result, returning 8238 1726882399.54275: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affc7ec-ae25-54bc-d334-00000000026f] 8238 1726882399.54281: sending task result for task 0affc7ec-ae25-54bc-d334-00000000026f 8238 1726882399.54706: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000026f 8238 1726882399.54710: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882399.54997: no more pending results, returning what we have 8238 1726882399.55000: results queue empty 8238 1726882399.55001: checking for any_errors_fatal 8238 1726882399.55005: done checking for any_errors_fatal 8238 1726882399.55006: checking for max_fail_percentage 8238 1726882399.55007: done checking for max_fail_percentage 8238 1726882399.55008: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.55009: done checking to see if all hosts have failed 8238 1726882399.55010: getting the remaining hosts for this loop 8238 1726882399.55011: done getting the remaining hosts for this loop 8238 1726882399.55015: getting the next task for host managed_node3 8238 1726882399.55020: done getting next task for host managed_node3 8238 1726882399.55226: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8238 1726882399.55229: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.55234: getting variables 8238 1726882399.55235: in VariableManager get_vars() 8238 1726882399.55271: Calling all_inventory to load vars for managed_node3 8238 1726882399.55274: Calling groups_inventory to load vars for managed_node3 8238 1726882399.55276: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.55287: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.55290: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.55293: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.58536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.61300: done with get_vars() 8238 1726882399.61341: done getting variables 8238 1726882399.61418: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882399.61541: variable 'profile' from source: include params 8238 1726882399.61546: variable 'item' from source: include params 8238 1726882399.61616: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:19 -0400 (0:00:00.100) 0:00:29.771 ****** 8238 1726882399.61661: entering _queue_task() for managed_node3/assert 8238 1726882399.62060: worker is 1 (out of 1 available) 8238 1726882399.62076: exiting _queue_task() for managed_node3/assert 8238 1726882399.62090: done queuing things up, now waiting for results queue to drain 8238 1726882399.62092: waiting for pending results... 
8238 1726882399.62377: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 8238 1726882399.62510: in run() - task 0affc7ec-ae25-54bc-d334-000000000270 8238 1726882399.62538: variable 'ansible_search_path' from source: unknown 8238 1726882399.62547: variable 'ansible_search_path' from source: unknown 8238 1726882399.62628: calling self._execute() 8238 1726882399.62811: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.63145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.63149: variable 'omit' from source: magic vars 8238 1726882399.63831: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.63848: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.63862: variable 'omit' from source: magic vars 8238 1726882399.63932: variable 'omit' from source: magic vars 8238 1726882399.64185: variable 'profile' from source: include params 8238 1726882399.64195: variable 'item' from source: include params 8238 1726882399.64336: variable 'item' from source: include params 8238 1726882399.64360: variable 'omit' from source: magic vars 8238 1726882399.64404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882399.64453: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882399.64477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882399.64500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.64515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.64556: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882399.64565: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.64572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.64702: Set connection var ansible_connection to ssh 8238 1726882399.64710: Set connection var ansible_shell_type to sh 8238 1726882399.64719: Set connection var ansible_pipelining to False 8238 1726882399.64731: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882399.64741: Set connection var ansible_timeout to 10 8238 1726882399.64753: Set connection var ansible_shell_executable to /bin/sh 8238 1726882399.64784: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.64791: variable 'ansible_connection' from source: unknown 8238 1726882399.64798: variable 'ansible_module_compression' from source: unknown 8238 1726882399.64804: variable 'ansible_shell_type' from source: unknown 8238 1726882399.64811: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.64818: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.64828: variable 'ansible_pipelining' from source: unknown 8238 1726882399.64880: variable 'ansible_timeout' from source: unknown 8238 1726882399.64883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.65000: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882399.65016: variable 'omit' from source: magic vars 8238 1726882399.65029: starting attempt loop 8238 1726882399.65035: running the handler 8238 1726882399.65157: variable 'lsr_net_profile_fingerprint' from source: set_fact 8238 1726882399.65168: Evaluated conditional (lsr_net_profile_fingerprint): True 8238 1726882399.65178: handler run complete 8238 1726882399.65204: attempt loop complete, returning result 8238 1726882399.65314: _execute() done 8238 1726882399.65317: dumping result to json 8238 1726882399.65320: done dumping result, returning 8238 1726882399.65324: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0affc7ec-ae25-54bc-d334-000000000270] 8238 1726882399.65327: sending task result for task 0affc7ec-ae25-54bc-d334-000000000270 8238 1726882399.65398: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000270 8238 1726882399.65401: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8238 1726882399.65466: no more pending results, returning what we have 8238 1726882399.65470: results queue empty 8238 1726882399.65471: checking for any_errors_fatal 8238 1726882399.65478: done checking for any_errors_fatal 8238 1726882399.65479: checking for max_fail_percentage 8238 1726882399.65480: done checking for max_fail_percentage 8238 1726882399.65481: checking to see if all hosts have failed and the running result is not ok 8238 1726882399.65482: done checking to see if all hosts have failed 8238 1726882399.65483: getting the remaining hosts for this loop 8238 1726882399.65484: done getting the remaining hosts for this loop 8238 1726882399.65488: getting the next task for host managed_node3 8238 1726882399.65497: done getting next task for host managed_node3 8238 1726882399.65500: ^ task is: TASK: ** TEST check polling interval 8238 1726882399.65503: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882399.65507: getting variables 8238 1726882399.65508: in VariableManager get_vars() 8238 1726882399.65551: Calling all_inventory to load vars for managed_node3 8238 1726882399.65557: Calling groups_inventory to load vars for managed_node3 8238 1726882399.65560: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882399.65570: Calling all_plugins_play to load vars for managed_node3 8238 1726882399.65573: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882399.65575: Calling groups_plugins_play to load vars for managed_node3 8238 1726882399.67463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882399.69518: done with get_vars() 8238 1726882399.69547: done getting variables 8238 1726882399.69613: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Friday 20 September 2024 21:33:19 -0400 (0:00:00.079) 0:00:29.851 ****** 8238 1726882399.69645: entering _queue_task() for managed_node3/command 8238 1726882399.69978: worker is 1 (out of 1 available) 8238 1726882399.69999: exiting _queue_task() for managed_node3/command 8238 1726882399.70014: done queuing things up, now waiting for results queue to drain 8238 1726882399.70015: waiting for pending results... 8238 1726882399.70305: running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 8238 1726882399.70442: in run() - task 0affc7ec-ae25-54bc-d334-000000000071 8238 1726882399.70446: variable 'ansible_search_path' from source: unknown 8238 1726882399.70552: calling self._execute() 8238 1726882399.70599: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.70611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.70628: variable 'omit' from source: magic vars 8238 1726882399.71045: variable 'ansible_distribution_major_version' from source: facts 8238 1726882399.71063: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882399.71074: variable 'omit' from source: magic vars 8238 1726882399.71106: variable 'omit' from source: magic vars 8238 1726882399.71224: variable 'controller_device' from source: play vars 8238 1726882399.71247: variable 'omit' from source: magic vars 8238 1726882399.71294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882399.71420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882399.71426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882399.71428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.71431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882399.71441: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882399.71449: variable 
'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.71456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.71574: Set connection var ansible_connection to ssh 8238 1726882399.71582: Set connection var ansible_shell_type to sh 8238 1726882399.71592: Set connection var ansible_pipelining to False 8238 1726882399.71601: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882399.71611: Set connection var ansible_timeout to 10 8238 1726882399.71626: Set connection var ansible_shell_executable to /bin/sh 8238 1726882399.71747: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.71751: variable 'ansible_connection' from source: unknown 8238 1726882399.71753: variable 'ansible_module_compression' from source: unknown 8238 1726882399.71757: variable 'ansible_shell_type' from source: unknown 8238 1726882399.71759: variable 'ansible_shell_executable' from source: unknown 8238 1726882399.71762: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882399.71764: variable 'ansible_pipelining' from source: unknown 8238 1726882399.71766: variable 'ansible_timeout' from source: unknown 8238 1726882399.71768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882399.71864: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882399.71885: variable 'omit' from source: magic vars 8238 1726882399.71894: starting attempt loop 8238 1726882399.71901: running the handler 8238 1726882399.71919: _low_level_execute_command(): starting 8238 1726882399.71964: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882399.72746: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882399.72845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882399.72871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882399.72995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882399.74789: stdout chunk (state=3): >>>/root <<< 8238 1726882399.74971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882399.75001: stdout chunk (state=3): >>><<< 8238 1726882399.75005: stderr chunk (state=3): 
>>><<< 8238 1726882399.75028: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882399.75048: _low_level_execute_command(): starting 8238 1726882399.75137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571 `" && echo ansible-tmp-1726882399.7503521-9404-224826369324571="` echo /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571 `" ) && sleep 0' 8238 1726882399.75713: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882399.75730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882399.75747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882399.75775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882399.75841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882399.75902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882399.75917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882399.75941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882399.76064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882399.78193: stdout chunk (state=3): >>>ansible-tmp-1726882399.7503521-9404-224826369324571=/root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571 <<< 8238 1726882399.78208: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 8238 1726882399.78434: stderr chunk (state=3): >>><<< 8238 1726882399.78437: stdout chunk (state=3): >>><<< 8238 1726882399.78440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882399.7503521-9404-224826369324571=/root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882399.78443: variable 'ansible_module_compression' from source: unknown 8238 1726882399.78445: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882399.78447: variable 'ansible_facts' from source: unknown 8238 1726882399.78510: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/AnsiballZ_command.py 8238 1726882399.78658: Sending initial data 8238 1726882399.78661: Sent initial data (154 bytes) 8238 1726882399.79387: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882399.79644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882399.79758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882399.81441: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 8238 1726882399.81445: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882399.81523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882399.81628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpugkcicdf /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/AnsiballZ_command.py <<< 8238 1726882399.81656: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/AnsiballZ_command.py" <<< 8238 1726882399.81753: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpugkcicdf" to remote "/root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/AnsiballZ_command.py" <<< 8238 1726882399.82868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882399.82872: stdout chunk (state=3): >>><<< 8238 1726882399.82874: stderr chunk (state=3): >>><<< 8238 1726882399.82876: done transferring module to remote 8238 1726882399.82879: _low_level_execute_command(): starting 8238 1726882399.82881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/ /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/AnsiballZ_command.py && sleep 0' 8238 1726882399.83439: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882399.83459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882399.83475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882399.83492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882399.83538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882399.83609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882399.83637: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 8238 1726882399.83668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882399.83759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882399.85827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882399.85834: stdout chunk (state=3): >>><<< 8238 1726882399.85837: stderr chunk (state=3): >>><<< 8238 1726882399.85840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882399.85842: _low_level_execute_command(): starting 8238 1726882399.85845: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/AnsiballZ_command.py && sleep 0' 8238 1726882399.86535: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882399.86544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882399.86555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882399.86574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882399.86586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882399.86593: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882399.86798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882399.86802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882399.86804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882399.86807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 8238 1726882399.86902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.03950: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:20.034172", "end": "2024-09-20 21:33:20.037822", "delta": "0:00:00.003650", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882400.05542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882400.05598: stderr chunk (state=3): >>><<< 8238 1726882400.05601: stdout chunk (state=3): >>><<< 8238 1726882400.05620: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:20.034172", "end": "2024-09-20 21:33:20.037822", "delta": "0:00:00.003650", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
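For reference, the grep invocation and module result above come from the task at /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75. A minimal sketch of what that task plausibly looks like, reconstructed only from this log: the command string, the controller_device play variable, and the "'110' in result.stdout" conditional appear above, while the register name, the until/retries loop, and changed_when are assumptions.

- name: "** TEST check polling interval"
  # Only the command and the '110' conditional are taken from the log; the rest is assumed layout.
  command: grep 'Polling Interval' /proc/net/bonding/{{ controller_device }}
  register: result                      # the log evaluates "'110' in result.stdout"
  until: "'110' in result.stdout"       # "attempts": 1 in the reported result suggests a retry loop (assumption)
  changed_when: false                   # the reported result shows "changed": false (assumption)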
8238 1726882400.05655: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882400.05661: _low_level_execute_command(): starting 8238 1726882400.05666: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882399.7503521-9404-224826369324571/ > /dev/null 2>&1 && sleep 0' 8238 1726882400.06101: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.06105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882400.06131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882400.06135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.06138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.06156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.06203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.06206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.06296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.08192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.08236: stderr chunk (state=3): >>><<< 8238 1726882400.08240: stdout chunk (state=3): >>><<< 8238 1726882400.08252: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.08261: handler run complete 8238 1726882400.08281: Evaluated conditional (False): False 8238 1726882400.08399: variable 'result' from source: unknown 8238 1726882400.08414: Evaluated conditional ('110' in result.stdout): True 8238 1726882400.08426: attempt loop complete, returning result 8238 1726882400.08429: _execute() done 8238 1726882400.08435: dumping result to json 8238 1726882400.08440: done dumping result, returning 8238 1726882400.08447: done running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval [0affc7ec-ae25-54bc-d334-000000000071] 8238 1726882400.08453: sending task result for task 0affc7ec-ae25-54bc-d334-000000000071 8238 1726882400.08560: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000071 8238 1726882400.08563: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003650", "end": "2024-09-20 21:33:20.037822", "rc": 0, "start": "2024-09-20 21:33:20.034172" } STDOUT: MII Polling Interval (ms): 110 8238 1726882400.08640: no more pending results, returning what we have 8238 1726882400.08643: results queue empty 8238 1726882400.08645: checking for any_errors_fatal 8238 1726882400.08651: done checking for any_errors_fatal 8238 1726882400.08652: checking for max_fail_percentage 8238 1726882400.08653: done checking for max_fail_percentage 8238 1726882400.08654: checking to see if all hosts have failed and the running result is not ok 8238 1726882400.08655: done checking to see if all hosts have failed 8238 1726882400.08656: getting the remaining hosts for this loop 8238 1726882400.08657: done getting the remaining hosts for this loop 8238 1726882400.08662: getting the next task for host managed_node3 8238 1726882400.08667: done getting next task for host managed_node3 8238 1726882400.08670: ^ task is: TASK: ** TEST check IPv4 8238 1726882400.08673: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882400.08677: getting variables 8238 1726882400.08678: in VariableManager get_vars() 8238 1726882400.08717: Calling all_inventory to load vars for managed_node3 8238 1726882400.08719: Calling groups_inventory to load vars for managed_node3 8238 1726882400.08728: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882400.08739: Calling all_plugins_play to load vars for managed_node3 8238 1726882400.08742: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882400.08745: Calling groups_plugins_play to load vars for managed_node3 8238 1726882400.09711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882400.10928: done with get_vars() 8238 1726882400.10946: done getting variables 8238 1726882400.10992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Friday 20 September 2024 21:33:20 -0400 (0:00:00.413) 0:00:30.265 ****** 8238 1726882400.11016: entering _queue_task() for managed_node3/command 8238 1726882400.11247: worker is 1 (out of 1 available) 8238 1726882400.11262: exiting _queue_task() for managed_node3/command 8238 1726882400.11274: done queuing things up, now waiting for results queue to drain 8238 1726882400.11276: waiting for pending results... 8238 1726882400.11453: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 8238 1726882400.11519: in run() - task 0affc7ec-ae25-54bc-d334-000000000072 8238 1726882400.11534: variable 'ansible_search_path' from source: unknown 8238 1726882400.11568: calling self._execute() 8238 1726882400.11650: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882400.11659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882400.11668: variable 'omit' from source: magic vars 8238 1726882400.11951: variable 'ansible_distribution_major_version' from source: facts 8238 1726882400.11964: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882400.11969: variable 'omit' from source: magic vars 8238 1726882400.11989: variable 'omit' from source: magic vars 8238 1726882400.12064: variable 'controller_device' from source: play vars 8238 1726882400.12078: variable 'omit' from source: magic vars 8238 1726882400.12113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882400.12144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882400.12165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882400.12180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882400.12189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882400.12215: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882400.12218: variable 
'ansible_host' from source: host vars for 'managed_node3' 8238 1726882400.12221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882400.12307: Set connection var ansible_connection to ssh 8238 1726882400.12311: Set connection var ansible_shell_type to sh 8238 1726882400.12313: Set connection var ansible_pipelining to False 8238 1726882400.12319: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882400.12327: Set connection var ansible_timeout to 10 8238 1726882400.12334: Set connection var ansible_shell_executable to /bin/sh 8238 1726882400.12351: variable 'ansible_shell_executable' from source: unknown 8238 1726882400.12354: variable 'ansible_connection' from source: unknown 8238 1726882400.12360: variable 'ansible_module_compression' from source: unknown 8238 1726882400.12362: variable 'ansible_shell_type' from source: unknown 8238 1726882400.12365: variable 'ansible_shell_executable' from source: unknown 8238 1726882400.12371: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882400.12373: variable 'ansible_pipelining' from source: unknown 8238 1726882400.12375: variable 'ansible_timeout' from source: unknown 8238 1726882400.12383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882400.12492: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882400.12504: variable 'omit' from source: magic vars 8238 1726882400.12507: starting attempt loop 8238 1726882400.12509: running the handler 8238 1726882400.12526: _low_level_execute_command(): starting 8238 1726882400.12532: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882400.13074: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.13077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.13081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882400.13085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.13131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.13138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.13140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.13225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.14895: stdout chunk 
(state=3): >>>/root <<< 8238 1726882400.14998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.15049: stderr chunk (state=3): >>><<< 8238 1726882400.15052: stdout chunk (state=3): >>><<< 8238 1726882400.15071: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.15082: _low_level_execute_command(): starting 8238 1726882400.15088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282 `" && echo ansible-tmp-1726882400.1507092-9420-5340213394282="` echo /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282 `" ) && sleep 0' 8238 1726882400.15518: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.15549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882400.15552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882400.15555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.15564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.15567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.15614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.15617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.15709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.17708: 
stdout chunk (state=3): >>>ansible-tmp-1726882400.1507092-9420-5340213394282=/root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282 <<< 8238 1726882400.17825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.17868: stderr chunk (state=3): >>><<< 8238 1726882400.17872: stdout chunk (state=3): >>><<< 8238 1726882400.17886: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882400.1507092-9420-5340213394282=/root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.17909: variable 'ansible_module_compression' from source: unknown 8238 1726882400.17947: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882400.17979: variable 'ansible_facts' from source: unknown 8238 1726882400.18034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/AnsiballZ_command.py 8238 1726882400.18130: Sending initial data 8238 1726882400.18134: Sent initial data (152 bytes) 8238 1726882400.18572: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882400.18576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882400.18578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.18580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.18637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.18641: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.18719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.20365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882400.20453: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882400.20551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp12xb4uxd /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/AnsiballZ_command.py <<< 8238 1726882400.20555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/AnsiballZ_command.py" <<< 8238 1726882400.20639: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp12xb4uxd" to remote "/root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/AnsiballZ_command.py" <<< 8238 1726882400.21677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.21680: stdout chunk (state=3): >>><<< 8238 1726882400.21683: stderr chunk (state=3): >>><<< 8238 1726882400.21685: done transferring module to remote 8238 1726882400.21687: _low_level_execute_command(): starting 8238 1726882400.21689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/ /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/AnsiballZ_command.py && sleep 0' 8238 1726882400.22241: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882400.22262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882400.22276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.22294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882400.22381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.22416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.22435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.22456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.22573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.24805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.24810: stdout chunk (state=3): >>><<< 8238 1726882400.24813: stderr chunk (state=3): >>><<< 8238 1726882400.24815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.24818: _low_level_execute_command(): starting 8238 1726882400.24820: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/AnsiballZ_command.py && sleep 0' 8238 1726882400.25604: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882400.25620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882400.25638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.25657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882400.25673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882400.25684: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882400.25696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.25714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882400.25731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882400.25743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882400.25755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882400.25769: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.25855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.25877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.25994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.42972: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.123/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 232sec preferred_lft 232sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:20.423998", "end": "2024-09-20 21:33:20.427909", "delta": "0:00:00.003911", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882400.44729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882400.44733: stdout chunk (state=3): >>><<< 8238 1726882400.44735: stderr chunk (state=3): >>><<< 8238 1726882400.44738: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.123/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 232sec preferred_lft 232sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:20.423998", "end": "2024-09-20 21:33:20.427909", "delta": "0:00:00.003911", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882400.44741: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882400.44745: _low_level_execute_command(): starting 8238 1726882400.44747: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882400.1507092-9420-5340213394282/ > /dev/null 2>&1 && sleep 0' 8238 1726882400.45619: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.45694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.45709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.45741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.46011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.47835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.47902: stderr chunk (state=3): >>><<< 8238 1726882400.47918: stdout chunk (state=3): >>><<< 8238 1726882400.47942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.47953: handler run complete 8238 1726882400.47983: Evaluated conditional (False): False 8238 1726882400.48160: variable 'result' from source: set_fact 8238 1726882400.48186: Evaluated conditional ('192.0.2' in result.stdout): True 8238 1726882400.48203: attempt loop complete, returning result 8238 1726882400.48210: _execute() done 8238 1726882400.48217: dumping result to json 8238 1726882400.48230: done dumping result, returning 8238 1726882400.48247: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0affc7ec-ae25-54bc-d334-000000000072] 8238 1726882400.48257: sending task result for task 0affc7ec-ae25-54bc-d334-000000000072 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003911", "end": "2024-09-20 21:33:20.427909", "rc": 0, "start": "2024-09-20 21:33:20.423998" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.123/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 232sec preferred_lft 232sec 8238 1726882400.48509: no more pending results, returning what we have 8238 1726882400.48513: results queue empty 8238 1726882400.48515: checking for any_errors_fatal 8238 1726882400.48525: done checking for any_errors_fatal 8238 1726882400.48526: checking for max_fail_percentage 8238 1726882400.48527: done checking for max_fail_percentage 8238 1726882400.48528: checking to see if all hosts have failed and the running result is not ok 8238 1726882400.48530: done checking to see if all hosts have failed 8238 1726882400.48531: getting the remaining hosts for this loop 8238 1726882400.48532: done getting the remaining hosts for this loop 8238 1726882400.48537: getting the next task for host managed_node3 8238 1726882400.48544: done getting next task for host managed_node3 8238 1726882400.48547: ^ task is: TASK: ** TEST check IPv6 8238 1726882400.48550: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882400.48554: getting variables 8238 1726882400.48556: in VariableManager get_vars() 8238 1726882400.48604: Calling all_inventory to load vars for managed_node3 8238 1726882400.48608: Calling groups_inventory to load vars for managed_node3 8238 1726882400.48610: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882400.48827: Calling all_plugins_play to load vars for managed_node3 8238 1726882400.48831: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882400.48837: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000072 8238 1726882400.48840: WORKER PROCESS EXITING 8238 1726882400.48848: Calling groups_plugins_play to load vars for managed_node3 8238 1726882400.50892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882400.52556: done with get_vars() 8238 1726882400.52576: done getting variables 8238 1726882400.52625: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Friday 20 September 2024 21:33:20 -0400 (0:00:00.416) 0:00:30.681 ****** 8238 1726882400.52648: entering _queue_task() for managed_node3/command 8238 1726882400.52904: worker is 1 (out of 1 available) 8238 1726882400.52919: exiting _queue_task() for managed_node3/command 8238 1726882400.52933: done queuing things up, now waiting for results queue to drain 8238 1726882400.52935: waiting for pending results... 
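
The two assertion tasks in this stretch of the run — "** TEST check IPv4", whose result is shown above, and "** TEST check IPv6" (tests_bond.yml:87), queued above and executed below — query the bond with ip(8) and accept the result only if the expected prefix appears in stdout. The playbook source is not reproduced in this log, but the logged commands (ip -4/-6 a s nm-bond, with the play variable controller_device presumably templated to nm-bond), the registered result variable, the attempts counter, and the evaluated conditionals ('192.0.2' in result.stdout, '2001' in result.stdout) suggest a task shape like the sketch below; the retries, delay, and changed_when settings are assumptions, not values taken from the log.

    - name: "** TEST check IPv4"
      command: ip -4 a s {{ controller_device }}
      register: result
      until: "'192.0.2' in result.stdout"
      retries: 20          # assumed retry budget; the log only records attempts: 1
      delay: 2             # assumed delay between retries
      changed_when: false  # consistent with the task reporting changed: false

    - name: "** TEST check IPv6"
      command: ip -6 a s {{ controller_device }}
      register: result
      until: "'2001' in result.stdout"
      retries: 20
      delay: 2
      changed_when: false
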
8238 1726882400.53115: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 8238 1726882400.53181: in run() - task 0affc7ec-ae25-54bc-d334-000000000073 8238 1726882400.53195: variable 'ansible_search_path' from source: unknown 8238 1726882400.53230: calling self._execute() 8238 1726882400.53316: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882400.53323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882400.53332: variable 'omit' from source: magic vars 8238 1726882400.53625: variable 'ansible_distribution_major_version' from source: facts 8238 1726882400.53636: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882400.53642: variable 'omit' from source: magic vars 8238 1726882400.53663: variable 'omit' from source: magic vars 8238 1726882400.53737: variable 'controller_device' from source: play vars 8238 1726882400.53751: variable 'omit' from source: magic vars 8238 1726882400.53789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882400.53818: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882400.53837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882400.53851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882400.53869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882400.53902: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882400.53905: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882400.53909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882400.54012: Set connection var ansible_connection to ssh 8238 1726882400.54016: Set connection var ansible_shell_type to sh 8238 1726882400.54019: Set connection var ansible_pipelining to False 8238 1726882400.54021: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882400.54026: Set connection var ansible_timeout to 10 8238 1726882400.54028: Set connection var ansible_shell_executable to /bin/sh 8238 1726882400.54151: variable 'ansible_shell_executable' from source: unknown 8238 1726882400.54154: variable 'ansible_connection' from source: unknown 8238 1726882400.54156: variable 'ansible_module_compression' from source: unknown 8238 1726882400.54159: variable 'ansible_shell_type' from source: unknown 8238 1726882400.54161: variable 'ansible_shell_executable' from source: unknown 8238 1726882400.54163: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882400.54165: variable 'ansible_pipelining' from source: unknown 8238 1726882400.54168: variable 'ansible_timeout' from source: unknown 8238 1726882400.54170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882400.54259: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882400.54263: variable 'omit' from source: magic vars 8238 1726882400.54266: starting attempt loop 8238 
1726882400.54268: running the handler 8238 1726882400.54270: _low_level_execute_command(): starting 8238 1726882400.54272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882400.54982: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882400.54985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882400.54989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.54992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882400.54994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882400.54997: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882400.54999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.55016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882400.55029: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882400.55035: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882400.55044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882400.55081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.55085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882400.55087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882400.55090: stderr chunk (state=3): >>>debug2: match found <<< 8238 1726882400.55093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.55173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.55177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.55195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.55302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.56940: stdout chunk (state=3): >>>/root <<< 8238 1726882400.57094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.57097: stdout chunk (state=3): >>><<< 8238 1726882400.57105: stderr chunk (state=3): >>><<< 8238 1726882400.57133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 
10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.57137: _low_level_execute_command(): starting 8238 1726882400.57144: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627 `" && echo ansible-tmp-1726882400.57124-9442-237243936559627="` echo /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627 `" ) && sleep 0' 8238 1726882400.57795: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.57800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.57803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.57814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.57898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.57933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.59899: stdout chunk (state=3): >>>ansible-tmp-1726882400.57124-9442-237243936559627=/root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627 <<< 8238 1726882400.60022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.60080: stderr chunk (state=3): >>><<< 8238 1726882400.60083: stdout chunk (state=3): >>><<< 8238 1726882400.60099: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882400.57124-9442-237243936559627=/root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.60146: variable 'ansible_module_compression' from source: unknown 8238 1726882400.60207: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882400.60247: variable 'ansible_facts' from source: unknown 8238 1726882400.60429: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/AnsiballZ_command.py 8238 1726882400.60529: Sending initial data 8238 1726882400.60532: Sent initial data (152 bytes) 8238 1726882400.61026: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882400.61035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882400.61046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.61138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.61150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.61168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.61179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.61284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.62854: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8238 1726882400.62875: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 8238 1726882400.62877: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882400.62958: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882400.63041: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp7wqllp8n /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/AnsiballZ_command.py <<< 8238 1726882400.63046: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/AnsiballZ_command.py" <<< 8238 1726882400.63123: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp7wqllp8n" to remote "/root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/AnsiballZ_command.py" <<< 8238 1726882400.64128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.64132: stderr chunk (state=3): >>><<< 8238 1726882400.64135: stdout chunk (state=3): >>><<< 8238 1726882400.64137: done transferring module to remote 8238 1726882400.64139: _low_level_execute_command(): starting 8238 1726882400.64142: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/ /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/AnsiballZ_command.py && sleep 0' 8238 1726882400.64553: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.64565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.64589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.64630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.64643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.64730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.66535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.66581: stderr chunk (state=3): >>><<< 8238 1726882400.66584: stdout chunk (state=3): >>><<< 8238 1726882400.66597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.66601: _low_level_execute_command(): starting 8238 1726882400.66604: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/AnsiballZ_command.py && sleep 0' 8238 1726882400.67017: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.67049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882400.67053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.67055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882400.67057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882400.67061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.67110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882400.67114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.67212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.83799: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::199/128 scope global dynamic noprefixroute \n valid_lft 234sec preferred_lft 234sec\n inet6 2001:db8::1803:21ff:fedf:c4df/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::1803:21ff:fedf:c4df/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:20.832613", "end": "2024-09-20 21:33:20.836415", "delta": "0:00:00.003802", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882400.85248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882400.85308: stderr chunk (state=3): >>><<< 8238 1726882400.85311: stdout chunk (state=3): >>><<< 8238 1726882400.85329: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::199/128 scope global dynamic noprefixroute \n valid_lft 234sec preferred_lft 234sec\n inet6 2001:db8::1803:21ff:fedf:c4df/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::1803:21ff:fedf:c4df/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:20.832613", "end": "2024-09-20 21:33:20.836415", "delta": "0:00:00.003802", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
8238 1726882400.85361: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882400.85370: _low_level_execute_command(): starting 8238 1726882400.85376: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882400.57124-9442-237243936559627/ > /dev/null 2>&1 && sleep 0' 8238 1726882400.86001: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882400.86357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882400.86400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882400.86487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882400.88528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882400.88532: stdout chunk (state=3): >>><<< 8238 1726882400.88535: stderr chunk (state=3): >>><<< 8238 1726882400.88539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882400.88542: handler run complete 8238 1726882400.88545: Evaluated conditional (False): False 8238 1726882400.88711: variable 'result' from source: set_fact 8238 1726882400.88731: Evaluated conditional ('2001' in result.stdout): True 8238 1726882400.88745: attempt loop complete, returning result 8238 1726882400.88748: _execute() done 8238 1726882400.88751: dumping result to json 8238 1726882400.88773: done dumping result, returning 8238 1726882400.88782: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0affc7ec-ae25-54bc-d334-000000000073] 8238 1726882400.88788: sending task result for task 0affc7ec-ae25-54bc-d334-000000000073 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003802", "end": "2024-09-20 21:33:20.836415", "rc": 0, "start": "2024-09-20 21:33:20.832613" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::199/128 scope global dynamic noprefixroute valid_lft 234sec preferred_lft 234sec inet6 2001:db8::1803:21ff:fedf:c4df/64 scope global dynamic noprefixroute valid_lft 1794sec preferred_lft 1794sec inet6 fe80::1803:21ff:fedf:c4df/64 scope link noprefixroute valid_lft forever preferred_lft forever 8238 1726882400.89096: no more pending results, returning what we have 8238 1726882400.89099: results queue empty 8238 1726882400.89100: checking for any_errors_fatal 8238 1726882400.89109: done checking for any_errors_fatal 8238 1726882400.89110: checking for max_fail_percentage 8238 1726882400.89111: done checking for max_fail_percentage 8238 1726882400.89112: checking to see if all hosts have failed and the running result is not ok 8238 1726882400.89113: done checking to see if all hosts have failed 8238 1726882400.89114: getting the remaining hosts for this loop 8238 1726882400.89115: done getting the remaining hosts for this loop 8238 1726882400.89119: getting the next task for host managed_node3 8238 1726882400.89336: done getting next task for host managed_node3 8238 1726882400.89341: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8238 1726882400.89345: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882400.89367: getting variables 8238 1726882400.89369: in VariableManager get_vars() 8238 1726882400.89412: Calling all_inventory to load vars for managed_node3 8238 1726882400.89415: Calling groups_inventory to load vars for managed_node3 8238 1726882400.89418: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882400.89435: Calling all_plugins_play to load vars for managed_node3 8238 1726882400.89438: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882400.89442: Calling groups_plugins_play to load vars for managed_node3 8238 1726882400.90078: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000073 8238 1726882400.90082: WORKER PROCESS EXITING 8238 1726882400.95682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882400.97694: done with get_vars() 8238 1726882400.97729: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:20 -0400 (0:00:00.451) 0:00:31.133 ****** 8238 1726882400.97823: entering _queue_task() for managed_node3/include_tasks 8238 1726882400.98200: worker is 1 (out of 1 available) 8238 1726882400.98216: exiting _queue_task() for managed_node3/include_tasks 8238 1726882400.98431: done queuing things up, now waiting for results queue to drain 8238 1726882400.98434: waiting for pending results... 8238 1726882400.98563: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8238 1726882400.98768: in run() - task 0affc7ec-ae25-54bc-d334-00000000007c 8238 1726882400.98772: variable 'ansible_search_path' from source: unknown 8238 1726882400.98776: variable 'ansible_search_path' from source: unknown 8238 1726882400.98792: calling self._execute() 8238 1726882400.98913: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882400.98920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882400.98937: variable 'omit' from source: magic vars 8238 1726882400.99271: variable 'ansible_distribution_major_version' from source: facts 8238 1726882400.99281: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882400.99287: _execute() done 8238 1726882400.99291: dumping result to json 8238 1726882400.99294: done dumping result, returning 8238 1726882400.99304: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-54bc-d334-00000000007c] 8238 1726882400.99306: sending task result for task 0affc7ec-ae25-54bc-d334-00000000007c 8238 1726882400.99405: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000007c 8238 1726882400.99407: WORKER PROCESS EXITING 8238 1726882400.99460: no more pending results, returning what we have 8238 1726882400.99465: in VariableManager get_vars() 8238 1726882400.99515: Calling all_inventory to load vars for managed_node3 8238 1726882400.99519: Calling groups_inventory to load vars for managed_node3 8238 1726882400.99521: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882400.99533: Calling all_plugins_play to load vars for managed_node3 8238 1726882400.99536: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882400.99538: Calling groups_plugins_play to load vars 
for managed_node3 8238 1726882401.00471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882401.02143: done with get_vars() 8238 1726882401.02159: variable 'ansible_search_path' from source: unknown 8238 1726882401.02160: variable 'ansible_search_path' from source: unknown 8238 1726882401.02188: we have included files to process 8238 1726882401.02189: generating all_blocks data 8238 1726882401.02191: done generating all_blocks data 8238 1726882401.02197: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8238 1726882401.02197: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8238 1726882401.02199: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8238 1726882401.02605: done processing included file 8238 1726882401.02606: iterating over new_blocks loaded from include file 8238 1726882401.02607: in VariableManager get_vars() 8238 1726882401.02626: done with get_vars() 8238 1726882401.02627: filtering new block on tags 8238 1726882401.02648: done filtering new block on tags 8238 1726882401.02650: in VariableManager get_vars() 8238 1726882401.02668: done with get_vars() 8238 1726882401.02670: filtering new block on tags 8238 1726882401.02697: done filtering new block on tags 8238 1726882401.02699: in VariableManager get_vars() 8238 1726882401.02714: done with get_vars() 8238 1726882401.02715: filtering new block on tags 8238 1726882401.02742: done filtering new block on tags 8238 1726882401.02743: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 8238 1726882401.02747: extending task lists for all hosts with included blocks 8238 1726882401.03493: done extending task lists 8238 1726882401.03495: done processing included files 8238 1726882401.03495: results queue empty 8238 1726882401.03496: checking for any_errors_fatal 8238 1726882401.03499: done checking for any_errors_fatal 8238 1726882401.03500: checking for max_fail_percentage 8238 1726882401.03501: done checking for max_fail_percentage 8238 1726882401.03501: checking to see if all hosts have failed and the running result is not ok 8238 1726882401.03502: done checking to see if all hosts have failed 8238 1726882401.03503: getting the remaining hosts for this loop 8238 1726882401.03503: done getting the remaining hosts for this loop 8238 1726882401.03505: getting the next task for host managed_node3 8238 1726882401.03508: done getting next task for host managed_node3 8238 1726882401.03510: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8238 1726882401.03512: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882401.03519: getting variables 8238 1726882401.03519: in VariableManager get_vars() 8238 1726882401.03532: Calling all_inventory to load vars for managed_node3 8238 1726882401.03534: Calling groups_inventory to load vars for managed_node3 8238 1726882401.03536: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882401.03540: Calling all_plugins_play to load vars for managed_node3 8238 1726882401.03542: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882401.03544: Calling groups_plugins_play to load vars for managed_node3 8238 1726882401.04695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882401.06126: done with get_vars() 8238 1726882401.06142: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:21 -0400 (0:00:00.083) 0:00:31.217 ****** 8238 1726882401.06197: entering _queue_task() for managed_node3/setup 8238 1726882401.06453: worker is 1 (out of 1 available) 8238 1726882401.06468: exiting _queue_task() for managed_node3/setup 8238 1726882401.06481: done queuing things up, now waiting for results queue to drain 8238 1726882401.06483: waiting for pending results... 
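
The role prologue that follows (tasks from roles/network/tasks/set_facts.yml) is entirely skipped on this pass: the extra fact gathering only runs when __network_required_facts still contains keys missing from ansible_facts, and the ostree detection only runs when __network_is_ostree has not already been set earlier in the play. The guard expressions below are quoted verbatim from the evaluations logged after this point; the module options and the register name are assumptions, not taken from the log.

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumed subset; only the when: guard is evidenced in the log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed path; the log only shows a skipped stat task
      register: __ostree_booted_stat   # hypothetical register name
      when: not __network_is_ostree is defined
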
8238 1726882401.06672: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8238 1726882401.06783: in run() - task 0affc7ec-ae25-54bc-d334-000000000491 8238 1726882401.06794: variable 'ansible_search_path' from source: unknown 8238 1726882401.06798: variable 'ansible_search_path' from source: unknown 8238 1726882401.06834: calling self._execute() 8238 1726882401.06914: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882401.06918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882401.06937: variable 'omit' from source: magic vars 8238 1726882401.07534: variable 'ansible_distribution_major_version' from source: facts 8238 1726882401.07538: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882401.07664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882401.09762: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882401.09916: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882401.09955: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882401.09978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882401.10000: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882401.10068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882401.10090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882401.10108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882401.10138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882401.10151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882401.10194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882401.10211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882401.10231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882401.10259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882401.10271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882401.10385: variable '__network_required_facts' from source: role '' defaults 8238 1726882401.10393: variable 'ansible_facts' from source: unknown 8238 1726882401.10909: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8238 1726882401.10913: when evaluation is False, skipping this task 8238 1726882401.10918: _execute() done 8238 1726882401.10920: dumping result to json 8238 1726882401.10923: done dumping result, returning 8238 1726882401.10935: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-54bc-d334-000000000491] 8238 1726882401.10938: sending task result for task 0affc7ec-ae25-54bc-d334-000000000491 8238 1726882401.11055: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000491 8238 1726882401.11058: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882401.11205: no more pending results, returning what we have 8238 1726882401.11210: results queue empty 8238 1726882401.11211: checking for any_errors_fatal 8238 1726882401.11212: done checking for any_errors_fatal 8238 1726882401.11213: checking for max_fail_percentage 8238 1726882401.11214: done checking for max_fail_percentage 8238 1726882401.11215: checking to see if all hosts have failed and the running result is not ok 8238 1726882401.11216: done checking to see if all hosts have failed 8238 1726882401.11217: getting the remaining hosts for this loop 8238 1726882401.11218: done getting the remaining hosts for this loop 8238 1726882401.11223: getting the next task for host managed_node3 8238 1726882401.11233: done getting next task for host managed_node3 8238 1726882401.11237: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 8238 1726882401.11242: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882401.11260: getting variables 8238 1726882401.11262: in VariableManager get_vars() 8238 1726882401.11299: Calling all_inventory to load vars for managed_node3 8238 1726882401.11302: Calling groups_inventory to load vars for managed_node3 8238 1726882401.11304: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882401.11312: Calling all_plugins_play to load vars for managed_node3 8238 1726882401.11315: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882401.11318: Calling groups_plugins_play to load vars for managed_node3 8238 1726882401.12781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882401.13936: done with get_vars() 8238 1726882401.13955: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:21 -0400 (0:00:00.078) 0:00:31.295 ****** 8238 1726882401.14041: entering _queue_task() for managed_node3/stat 8238 1726882401.14306: worker is 1 (out of 1 available) 8238 1726882401.14324: exiting _queue_task() for managed_node3/stat 8238 1726882401.14335: done queuing things up, now waiting for results queue to drain 8238 1726882401.14337: waiting for pending results... 8238 1726882401.14530: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 8238 1726882401.14649: in run() - task 0affc7ec-ae25-54bc-d334-000000000493 8238 1726882401.14664: variable 'ansible_search_path' from source: unknown 8238 1726882401.14669: variable 'ansible_search_path' from source: unknown 8238 1726882401.14699: calling self._execute() 8238 1726882401.14779: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882401.14784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882401.14795: variable 'omit' from source: magic vars 8238 1726882401.15084: variable 'ansible_distribution_major_version' from source: facts 8238 1726882401.15095: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882401.15219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882401.15423: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882401.15459: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882401.15486: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882401.15512: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882401.15612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882401.15634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882401.15657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882401.15678: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882401.15745: variable '__network_is_ostree' from source: set_fact 8238 1726882401.15748: Evaluated conditional (not __network_is_ostree is defined): False 8238 1726882401.15751: when evaluation is False, skipping this task 8238 1726882401.15757: _execute() done 8238 1726882401.15760: dumping result to json 8238 1726882401.15762: done dumping result, returning 8238 1726882401.15770: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-54bc-d334-000000000493] 8238 1726882401.15773: sending task result for task 0affc7ec-ae25-54bc-d334-000000000493 8238 1726882401.15863: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000493 8238 1726882401.15866: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8238 1726882401.15935: no more pending results, returning what we have 8238 1726882401.15939: results queue empty 8238 1726882401.15940: checking for any_errors_fatal 8238 1726882401.15948: done checking for any_errors_fatal 8238 1726882401.15948: checking for max_fail_percentage 8238 1726882401.15950: done checking for max_fail_percentage 8238 1726882401.15951: checking to see if all hosts have failed and the running result is not ok 8238 1726882401.15955: done checking to see if all hosts have failed 8238 1726882401.15956: getting the remaining hosts for this loop 8238 1726882401.15957: done getting the remaining hosts for this loop 8238 1726882401.15961: getting the next task for host managed_node3 8238 1726882401.15968: done getting next task for host managed_node3 8238 1726882401.15972: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8238 1726882401.15977: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882401.15995: getting variables 8238 1726882401.15997: in VariableManager get_vars() 8238 1726882401.16033: Calling all_inventory to load vars for managed_node3 8238 1726882401.16036: Calling groups_inventory to load vars for managed_node3 8238 1726882401.16038: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882401.16046: Calling all_plugins_play to load vars for managed_node3 8238 1726882401.16048: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882401.16051: Calling groups_plugins_play to load vars for managed_node3 8238 1726882401.16992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882401.18250: done with get_vars() 8238 1726882401.18271: done getting variables 8238 1726882401.18314: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:21 -0400 (0:00:00.042) 0:00:31.338 ****** 8238 1726882401.18342: entering _queue_task() for managed_node3/set_fact 8238 1726882401.18573: worker is 1 (out of 1 available) 8238 1726882401.18587: exiting _queue_task() for managed_node3/set_fact 8238 1726882401.18598: done queuing things up, now waiting for results queue to drain 8238 1726882401.18600: waiting for pending results... 8238 1726882401.18770: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8238 1726882401.18883: in run() - task 0affc7ec-ae25-54bc-d334-000000000494 8238 1726882401.18895: variable 'ansible_search_path' from source: unknown 8238 1726882401.18898: variable 'ansible_search_path' from source: unknown 8238 1726882401.18930: calling self._execute() 8238 1726882401.19003: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882401.19007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882401.19016: variable 'omit' from source: magic vars 8238 1726882401.19297: variable 'ansible_distribution_major_version' from source: facts 8238 1726882401.19307: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882401.19428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882401.19623: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882401.19659: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882401.19684: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882401.19715: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882401.19809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882401.19831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882401.19850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882401.19871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882401.19935: variable '__network_is_ostree' from source: set_fact 8238 1726882401.19941: Evaluated conditional (not __network_is_ostree is defined): False 8238 1726882401.19945: when evaluation is False, skipping this task 8238 1726882401.19947: _execute() done 8238 1726882401.19952: dumping result to json 8238 1726882401.19958: done dumping result, returning 8238 1726882401.19962: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-54bc-d334-000000000494] 8238 1726882401.19968: sending task result for task 0affc7ec-ae25-54bc-d334-000000000494 8238 1726882401.20058: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000494 8238 1726882401.20061: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8238 1726882401.20105: no more pending results, returning what we have 8238 1726882401.20108: results queue empty 8238 1726882401.20109: checking for any_errors_fatal 8238 1726882401.20114: done checking for any_errors_fatal 8238 1726882401.20114: checking for max_fail_percentage 8238 1726882401.20116: done checking for max_fail_percentage 8238 1726882401.20117: checking to see if all hosts have failed and the running result is not ok 8238 1726882401.20118: done checking to see if all hosts have failed 8238 1726882401.20118: getting the remaining hosts for this loop 8238 1726882401.20120: done getting the remaining hosts for this loop 8238 1726882401.20125: getting the next task for host managed_node3 8238 1726882401.20134: done getting next task for host managed_node3 8238 1726882401.20137: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 8238 1726882401.20142: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882401.20159: getting variables 8238 1726882401.20161: in VariableManager get_vars() 8238 1726882401.20194: Calling all_inventory to load vars for managed_node3 8238 1726882401.20197: Calling groups_inventory to load vars for managed_node3 8238 1726882401.20199: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882401.20207: Calling all_plugins_play to load vars for managed_node3 8238 1726882401.20210: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882401.20212: Calling groups_plugins_play to load vars for managed_node3 8238 1726882401.21140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882401.22309: done with get_vars() 8238 1726882401.22327: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:21 -0400 (0:00:00.040) 0:00:31.379 ****** 8238 1726882401.22397: entering _queue_task() for managed_node3/service_facts 8238 1726882401.22615: worker is 1 (out of 1 available) 8238 1726882401.22629: exiting _queue_task() for managed_node3/service_facts 8238 1726882401.22643: done queuing things up, now waiting for results queue to drain 8238 1726882401.22645: waiting for pending results... 8238 1726882401.22818: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 8238 1726882401.22927: in run() - task 0affc7ec-ae25-54bc-d334-000000000496 8238 1726882401.22939: variable 'ansible_search_path' from source: unknown 8238 1726882401.22942: variable 'ansible_search_path' from source: unknown 8238 1726882401.22972: calling self._execute() 8238 1726882401.23044: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882401.23048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882401.23058: variable 'omit' from source: magic vars 8238 1726882401.23333: variable 'ansible_distribution_major_version' from source: facts 8238 1726882401.23342: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882401.23348: variable 'omit' from source: magic vars 8238 1726882401.23422: variable 'omit' from source: magic vars 8238 1726882401.23455: variable 'omit' from source: magic vars 8238 1726882401.23485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882401.23512: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882401.23530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882401.23550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882401.23559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882401.23584: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882401.23587: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882401.23590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882401.23670: Set connection var ansible_connection to ssh 8238 1726882401.23674: Set connection var 
ansible_shell_type to sh 8238 1726882401.23677: Set connection var ansible_pipelining to False 8238 1726882401.23683: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882401.23689: Set connection var ansible_timeout to 10 8238 1726882401.23696: Set connection var ansible_shell_executable to /bin/sh 8238 1726882401.23713: variable 'ansible_shell_executable' from source: unknown 8238 1726882401.23716: variable 'ansible_connection' from source: unknown 8238 1726882401.23718: variable 'ansible_module_compression' from source: unknown 8238 1726882401.23721: variable 'ansible_shell_type' from source: unknown 8238 1726882401.23725: variable 'ansible_shell_executable' from source: unknown 8238 1726882401.23728: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882401.23732: variable 'ansible_pipelining' from source: unknown 8238 1726882401.23734: variable 'ansible_timeout' from source: unknown 8238 1726882401.23739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882401.23890: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882401.23898: variable 'omit' from source: magic vars 8238 1726882401.23903: starting attempt loop 8238 1726882401.23906: running the handler 8238 1726882401.23917: _low_level_execute_command(): starting 8238 1726882401.23925: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882401.24463: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882401.24467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882401.24470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882401.24472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882401.24532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882401.24540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882401.24544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882401.24627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882401.26394: stdout chunk (state=3): >>>/root <<< 8238 1726882401.26504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882401.26548: stderr chunk (state=3): >>><<< 8238 1726882401.26551: stdout chunk (state=3): >>><<< 8238 1726882401.26571: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882401.26586: _low_level_execute_command(): starting 8238 1726882401.26589: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302 `" && echo ansible-tmp-1726882401.2657065-9468-140388995656302="` echo /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302 `" ) && sleep 0' 8238 1726882401.26999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882401.27013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882401.27019: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882401.27048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882401.27092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882401.27096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882401.27188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882401.29154: stdout chunk (state=3): >>>ansible-tmp-1726882401.2657065-9468-140388995656302=/root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302 <<< 8238 1726882401.29531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882401.29534: stdout chunk (state=3): >>><<< 8238 1726882401.29537: stderr chunk (state=3): >>><<< 8238 1726882401.29540: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726882401.2657065-9468-140388995656302=/root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882401.29542: variable 'ansible_module_compression' from source: unknown 8238 1726882401.29545: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 8238 1726882401.29547: variable 'ansible_facts' from source: unknown 8238 1726882401.29588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/AnsiballZ_service_facts.py 8238 1726882401.29771: Sending initial data 8238 1726882401.29786: Sent initial data (160 bytes) 8238 1726882401.30433: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882401.30440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882401.30444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882401.30457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882401.30510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882401.30513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882401.30602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882401.32174: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8238 1726882401.32178: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882401.32271: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882401.32362: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpb9ppcv4d /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/AnsiballZ_service_facts.py <<< 8238 1726882401.32365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/AnsiballZ_service_facts.py" <<< 8238 1726882401.32444: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpb9ppcv4d" to remote "/root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/AnsiballZ_service_facts.py" <<< 8238 1726882401.33377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882401.33384: stderr chunk (state=3): >>><<< 8238 1726882401.33387: stdout chunk (state=3): >>><<< 8238 1726882401.33407: done transferring module to remote 8238 1726882401.33416: _low_level_execute_command(): starting 8238 1726882401.33419: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/ /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/AnsiballZ_service_facts.py && sleep 0' 8238 1726882401.33825: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882401.33856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882401.33861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882401.33863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882401.33911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882401.33914: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882401.34003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882401.35929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882401.35933: stdout chunk (state=3): >>><<< 8238 1726882401.35936: stderr chunk (state=3): >>><<< 8238 1726882401.35948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882401.35952: _low_level_execute_command(): starting 8238 1726882401.35954: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/AnsiballZ_service_facts.py && sleep 0' 8238 1726882401.36601: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882401.37032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882401.37036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882403.50763: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": 
"auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind<<< 8238 1726882403.50809: stdout chunk (state=3): >>>.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": 
"systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 8238 1726882403.52398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882403.52551: stdout chunk (state=3): >>><<< 8238 1726882403.52557: stderr chunk (state=3): >>><<< 8238 1726882403.52730: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", 
"status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": 
"plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
8238 1726882403.55094: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882403.55229: _low_level_execute_command(): starting 8238 1726882403.55233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882401.2657065-9468-140388995656302/ > /dev/null 2>&1 && sleep 0' 8238 1726882403.55898: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882403.55908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882403.55940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.55969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.56048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882403.56108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882403.56354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882403.58137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882403.58182: stderr chunk (state=3): >>><<< 8238 1726882403.58237: stdout chunk (state=3): >>><<< 8238 1726882403.58259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882403.58267: handler run complete 8238 1726882403.58705: variable 'ansible_facts' from source: unknown 8238 1726882403.59165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882403.60326: variable 'ansible_facts' from source: unknown 8238 1726882403.60691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882403.61157: attempt loop complete, returning result 8238 1726882403.61166: _execute() done 8238 1726882403.61169: dumping result to json 8238 1726882403.61457: done dumping result, returning 8238 1726882403.61465: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-54bc-d334-000000000496] 8238 1726882403.61472: sending task result for task 0affc7ec-ae25-54bc-d334-000000000496 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882403.63857: no more pending results, returning what we have 8238 1726882403.63860: results queue empty 8238 1726882403.63861: checking for any_errors_fatal 8238 1726882403.63864: done checking for any_errors_fatal 8238 1726882403.63865: checking for max_fail_percentage 8238 1726882403.63868: done checking for max_fail_percentage 8238 1726882403.63869: checking to see if all hosts have failed and the running result is not ok 8238 1726882403.63870: done checking to see if all hosts have failed 8238 1726882403.63871: getting the remaining hosts for this loop 8238 1726882403.63872: done getting the remaining hosts for this loop 8238 1726882403.63875: getting the next task for host managed_node3 8238 1726882403.63881: done getting next task for host managed_node3 8238 1726882403.63884: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 8238 1726882403.63890: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882403.63904: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000496 8238 1726882403.63907: WORKER PROCESS EXITING 8238 1726882403.63915: getting variables 8238 1726882403.63916: in VariableManager get_vars() 8238 1726882403.63950: Calling all_inventory to load vars for managed_node3 8238 1726882403.63953: Calling groups_inventory to load vars for managed_node3 8238 1726882403.63955: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882403.63964: Calling all_plugins_play to load vars for managed_node3 8238 1726882403.63967: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882403.63970: Calling groups_plugins_play to load vars for managed_node3 8238 1726882403.67062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882403.70078: done with get_vars() 8238 1726882403.70116: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:23 -0400 (0:00:02.478) 0:00:33.857 ****** 8238 1726882403.70240: entering _queue_task() for managed_node3/package_facts 8238 1726882403.70956: worker is 1 (out of 1 available) 8238 1726882403.70970: exiting _queue_task() for managed_node3/package_facts 8238 1726882403.70982: done queuing things up, now waiting for results queue to drain 8238 1726882403.70983: waiting for pending results... 8238 1726882403.71444: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 8238 1726882403.71450: in run() - task 0affc7ec-ae25-54bc-d334-000000000497 8238 1726882403.71454: variable 'ansible_search_path' from source: unknown 8238 1726882403.71456: variable 'ansible_search_path' from source: unknown 8238 1726882403.71542: calling self._execute() 8238 1726882403.71604: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882403.71609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882403.71621: variable 'omit' from source: magic vars 8238 1726882403.72048: variable 'ansible_distribution_major_version' from source: facts 8238 1726882403.72063: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882403.72070: variable 'omit' from source: magic vars 8238 1726882403.72171: variable 'omit' from source: magic vars 8238 1726882403.72270: variable 'omit' from source: magic vars 8238 1726882403.72314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882403.72358: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882403.72382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882403.72402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882403.72414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882403.72485: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882403.72489: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882403.72492: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 8238 1726882403.72741: Set connection var ansible_connection to ssh 8238 1726882403.72745: Set connection var ansible_shell_type to sh 8238 1726882403.72748: Set connection var ansible_pipelining to False 8238 1726882403.72751: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882403.72753: Set connection var ansible_timeout to 10 8238 1726882403.72756: Set connection var ansible_shell_executable to /bin/sh 8238 1726882403.72758: variable 'ansible_shell_executable' from source: unknown 8238 1726882403.72760: variable 'ansible_connection' from source: unknown 8238 1726882403.72763: variable 'ansible_module_compression' from source: unknown 8238 1726882403.72765: variable 'ansible_shell_type' from source: unknown 8238 1726882403.72767: variable 'ansible_shell_executable' from source: unknown 8238 1726882403.72769: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882403.72771: variable 'ansible_pipelining' from source: unknown 8238 1726882403.72773: variable 'ansible_timeout' from source: unknown 8238 1726882403.72775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882403.73017: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882403.73063: variable 'omit' from source: magic vars 8238 1726882403.73067: starting attempt loop 8238 1726882403.73070: running the handler 8238 1726882403.73073: _low_level_execute_command(): starting 8238 1726882403.73076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882403.73869: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882403.73945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.74065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882403.74073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882403.74195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882403.75901: stdout chunk (state=3): >>>/root <<< 8238 1726882403.76059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882403.76094: stderr chunk (state=3): >>><<< 8238 1726882403.76098: stdout chunk (state=3): >>><<< 8238 1726882403.76118: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882403.76133: _low_level_execute_command(): starting 8238 1726882403.76139: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715 `" && echo ansible-tmp-1726882403.7611895-9568-243764992784715="` echo /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715 `" ) && sleep 0' 8238 1726882403.76839: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.76870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882403.76887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882403.76901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882403.77024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882403.78980: stdout chunk (state=3): >>>ansible-tmp-1726882403.7611895-9568-243764992784715=/root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715 <<< 8238 1726882403.79097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882403.79145: stderr chunk (state=3): >>><<< 8238 1726882403.79151: stdout chunk (state=3): >>><<< 8238 1726882403.79169: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882403.7611895-9568-243764992784715=/root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715 , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882403.79212: variable 'ansible_module_compression' from source: unknown 8238 1726882403.79252: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 8238 1726882403.79306: variable 'ansible_facts' from source: unknown 8238 1726882403.79425: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/AnsiballZ_package_facts.py 8238 1726882403.79545: Sending initial data 8238 1726882403.79549: Sent initial data (160 bytes) 8238 1726882403.79997: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882403.80001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882403.80004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.80006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882403.80010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.80055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882403.80059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882403.80149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882403.81792: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882403.81865: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882403.81972: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp4vaz0mkb /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/AnsiballZ_package_facts.py <<< 8238 1726882403.81976: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/AnsiballZ_package_facts.py" <<< 8238 1726882403.82060: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp4vaz0mkb" to remote "/root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/AnsiballZ_package_facts.py" <<< 8238 1726882403.83575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882403.83595: stderr chunk (state=3): >>><<< 8238 1726882403.83629: stdout chunk (state=3): >>><<< 8238 1726882403.83633: done transferring module to remote 8238 1726882403.83639: _low_level_execute_command(): starting 8238 1726882403.83642: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/ /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/AnsiballZ_package_facts.py && sleep 0' 8238 1726882403.84094: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882403.84097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.84101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882403.84105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.84150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882403.84153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882403.84262: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 8238 1726882403.86099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882403.86143: stderr chunk (state=3): >>><<< 8238 1726882403.86147: stdout chunk (state=3): >>><<< 8238 1726882403.86161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882403.86164: _low_level_execute_command(): starting 8238 1726882403.86167: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/AnsiballZ_package_facts.py && sleep 0' 8238 1726882403.86585: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882403.86589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882403.86620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.86626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882403.86629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882403.86680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882403.86687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882403.86689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882403.86776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882404.48718: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", 
"release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 8238 1726882404.48736: stdout chunk (state=3): >>>me": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": 
"hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": 
"6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 8238 1726882404.48874: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", 
"version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1",<<< 8238 1726882404.48929: stdout chunk (state=3): >>> "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": 
"libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": 
"iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", 
"version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", 
"epoch": null, "arch": "x86_64", "source": "rp<<< 8238 1726882404.48936: stdout chunk (state=3): >>>m"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": 
"perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": 
"perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", 
"release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch":<<< 8238 1726882404.48963: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": 
"2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 8238 1726882404.50771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882404.50893: stderr chunk (state=3): >>><<< 8238 1726882404.50897: stdout chunk (state=3): >>><<< 8238 1726882404.51037: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882404.54353: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882404.54381: _low_level_execute_command(): starting 8238 1726882404.54400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882403.7611895-9568-243764992784715/ > /dev/null 2>&1 && sleep 0' 8238 1726882404.55146: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882404.55215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882404.55238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882404.55292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882404.55387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882404.57417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882404.57424: stdout chunk (state=3): >>><<< 8238 1726882404.57427: stderr chunk (state=3): >>><<< 8238 1726882404.57454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882404.57634: handler run complete 8238 1726882404.58787: variable 'ansible_facts' from source: unknown 8238 1726882404.59365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882404.61940: variable 'ansible_facts' from source: unknown 8238 1726882404.62510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882404.63486: attempt loop complete, returning result 8238 1726882404.63509: _execute() done 8238 1726882404.63518: dumping result to json 8238 1726882404.63753: done dumping result, returning 8238 1726882404.63768: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-54bc-d334-000000000497] 8238 1726882404.63778: sending task result for task 0affc7ec-ae25-54bc-d334-000000000497 8238 1726882404.67008: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000497 8238 1726882404.67012: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882404.67171: no more pending results, returning what we have 8238 1726882404.67174: results queue empty 8238 1726882404.67175: checking for any_errors_fatal 8238 1726882404.67181: done checking for any_errors_fatal 8238 1726882404.67181: checking for max_fail_percentage 8238 1726882404.67183: done checking for max_fail_percentage 8238 1726882404.67184: checking to see if all hosts have failed and the running result is not ok 8238 1726882404.67185: done checking to see if all hosts have failed 8238 1726882404.67185: getting the remaining hosts for this loop 8238 1726882404.67187: done getting the remaining hosts for this loop 8238 1726882404.67190: getting the next task for host managed_node3 8238 1726882404.67198: done getting next task for host managed_node3 8238 1726882404.67202: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 8238 1726882404.67206: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882404.67218: getting variables 8238 1726882404.67220: in VariableManager get_vars() 8238 1726882404.67263: Calling all_inventory to load vars for managed_node3 8238 1726882404.67266: Calling groups_inventory to load vars for managed_node3 8238 1726882404.67269: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882404.67278: Calling all_plugins_play to load vars for managed_node3 8238 1726882404.67281: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882404.67284: Calling groups_plugins_play to load vars for managed_node3 8238 1726882404.68918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882404.71238: done with get_vars() 8238 1726882404.71271: done getting variables 8238 1726882404.71349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:24 -0400 (0:00:01.011) 0:00:34.869 ****** 8238 1726882404.71397: entering _queue_task() for managed_node3/debug 8238 1726882404.72047: worker is 1 (out of 1 available) 8238 1726882404.72061: exiting _queue_task() for managed_node3/debug 8238 1726882404.72072: done queuing things up, now waiting for results queue to drain 8238 1726882404.72074: waiting for pending results... 8238 1726882404.72208: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 8238 1726882404.72315: in run() - task 0affc7ec-ae25-54bc-d334-00000000007d 8238 1726882404.72339: variable 'ansible_search_path' from source: unknown 8238 1726882404.72347: variable 'ansible_search_path' from source: unknown 8238 1726882404.72392: calling self._execute() 8238 1726882404.72521: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882404.72528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882404.72628: variable 'omit' from source: magic vars 8238 1726882404.72992: variable 'ansible_distribution_major_version' from source: facts 8238 1726882404.73009: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882404.73019: variable 'omit' from source: magic vars 8238 1726882404.73112: variable 'omit' from source: magic vars 8238 1726882404.73234: variable 'network_provider' from source: set_fact 8238 1726882404.73260: variable 'omit' from source: magic vars 8238 1726882404.73317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882404.73366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882404.73395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882404.73425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882404.73442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882404.73483: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882404.73491: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882404.73504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882404.73719: Set connection var ansible_connection to ssh 8238 1726882404.73726: Set connection var ansible_shell_type to sh 8238 1726882404.73729: Set connection var ansible_pipelining to False 8238 1726882404.73732: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882404.73734: Set connection var ansible_timeout to 10 8238 1726882404.73736: Set connection var ansible_shell_executable to /bin/sh 8238 1726882404.73738: variable 'ansible_shell_executable' from source: unknown 8238 1726882404.73740: variable 'ansible_connection' from source: unknown 8238 1726882404.73742: variable 'ansible_module_compression' from source: unknown 8238 1726882404.73744: variable 'ansible_shell_type' from source: unknown 8238 1726882404.73746: variable 'ansible_shell_executable' from source: unknown 8238 1726882404.73748: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882404.73750: variable 'ansible_pipelining' from source: unknown 8238 1726882404.73752: variable 'ansible_timeout' from source: unknown 8238 1726882404.73757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882404.73981: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882404.73985: variable 'omit' from source: magic vars 8238 1726882404.73987: starting attempt loop 8238 1726882404.73990: running the handler 8238 1726882404.74012: handler run complete 8238 1726882404.74048: attempt loop complete, returning result 8238 1726882404.74051: _execute() done 8238 1726882404.74054: dumping result to json 8238 1726882404.74089: done dumping result, returning 8238 1726882404.74092: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-54bc-d334-00000000007d] 8238 1726882404.74094: sending task result for task 0affc7ec-ae25-54bc-d334-00000000007d 8238 1726882404.74430: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000007d 8238 1726882404.74434: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 8238 1726882404.74495: no more pending results, returning what we have 8238 1726882404.74499: results queue empty 8238 1726882404.74500: checking for any_errors_fatal 8238 1726882404.74507: done checking for any_errors_fatal 8238 1726882404.74507: checking for max_fail_percentage 8238 1726882404.74509: done checking for max_fail_percentage 8238 1726882404.74510: checking to see if all hosts have failed and the running result is not ok 8238 1726882404.74511: done checking to see if all hosts have failed 8238 1726882404.74512: getting the remaining hosts for this loop 8238 1726882404.74513: done getting the remaining hosts for this loop 8238 1726882404.74517: getting the next task for host managed_node3 8238 1726882404.74526: done getting next task for host managed_node3 8238 1726882404.74530: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 8238 1726882404.74534: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882404.74546: getting variables 8238 1726882404.74547: in VariableManager get_vars() 8238 1726882404.74591: Calling all_inventory to load vars for managed_node3 8238 1726882404.74594: Calling groups_inventory to load vars for managed_node3 8238 1726882404.74597: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882404.74607: Calling all_plugins_play to load vars for managed_node3 8238 1726882404.74610: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882404.74613: Calling groups_plugins_play to load vars for managed_node3 8238 1726882404.76452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882404.78621: done with get_vars() 8238 1726882404.78665: done getting variables 8238 1726882404.78728: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:24 -0400 (0:00:00.073) 0:00:34.942 ****** 8238 1726882404.78776: entering _queue_task() for managed_node3/fail 8238 1726882404.79165: worker is 1 (out of 1 available) 8238 1726882404.79294: exiting _queue_task() for managed_node3/fail 8238 1726882404.79307: done queuing things up, now waiting for results queue to drain 8238 1726882404.79308: waiting for pending results... 
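
The trace above covers two role steps on managed_node3: the package inventory collected by package_facts (its result is shown as "censored" in the task summary because 'no_log: true' was set for it) and a debug task that reports "Using network provider: nm". A minimal sketch of how those two steps are typically written in a tasks file follows; the task names and module arguments mirror what the log shows, but this is an illustration, not the fedora.linux_system_roles.network source.

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto        # matches the module_args recorded in the invocation above
    strategy: first      # also shown in the invocation above
  no_log: true           # why the task summary prints the "censored" placeholder

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # rendered as "nm" in this run
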
8238 1726882404.79635: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8238 1726882404.79700: in run() - task 0affc7ec-ae25-54bc-d334-00000000007e 8238 1726882404.79724: variable 'ansible_search_path' from source: unknown 8238 1726882404.79735: variable 'ansible_search_path' from source: unknown 8238 1726882404.79787: calling self._execute() 8238 1726882404.79896: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882404.79908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882404.79924: variable 'omit' from source: magic vars 8238 1726882404.80384: variable 'ansible_distribution_major_version' from source: facts 8238 1726882404.80388: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882404.80528: variable 'network_state' from source: role '' defaults 8238 1726882404.80544: Evaluated conditional (network_state != {}): False 8238 1726882404.80551: when evaluation is False, skipping this task 8238 1726882404.80601: _execute() done 8238 1726882404.80604: dumping result to json 8238 1726882404.80607: done dumping result, returning 8238 1726882404.80610: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-54bc-d334-00000000007e] 8238 1726882404.80615: sending task result for task 0affc7ec-ae25-54bc-d334-00000000007e 8238 1726882404.80931: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000007e 8238 1726882404.80935: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882404.80977: no more pending results, returning what we have 8238 1726882404.80981: results queue empty 8238 1726882404.80982: checking for any_errors_fatal 8238 1726882404.80988: done checking for any_errors_fatal 8238 1726882404.80989: checking for max_fail_percentage 8238 1726882404.80990: done checking for max_fail_percentage 8238 1726882404.80991: checking to see if all hosts have failed and the running result is not ok 8238 1726882404.80992: done checking to see if all hosts have failed 8238 1726882404.80993: getting the remaining hosts for this loop 8238 1726882404.80995: done getting the remaining hosts for this loop 8238 1726882404.80998: getting the next task for host managed_node3 8238 1726882404.81004: done getting next task for host managed_node3 8238 1726882404.81008: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8238 1726882404.81012: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882404.81031: getting variables 8238 1726882404.81033: in VariableManager get_vars() 8238 1726882404.81072: Calling all_inventory to load vars for managed_node3 8238 1726882404.81075: Calling groups_inventory to load vars for managed_node3 8238 1726882404.81077: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882404.81087: Calling all_plugins_play to load vars for managed_node3 8238 1726882404.81090: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882404.81093: Calling groups_plugins_play to load vars for managed_node3 8238 1726882404.83047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882404.85229: done with get_vars() 8238 1726882404.85271: done getting variables 8238 1726882404.85344: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:24 -0400 (0:00:00.066) 0:00:35.008 ****** 8238 1726882404.85391: entering _queue_task() for managed_node3/fail 8238 1726882404.85907: worker is 1 (out of 1 available) 8238 1726882404.85920: exiting _queue_task() for managed_node3/fail 8238 1726882404.85932: done queuing things up, now waiting for results queue to drain 8238 1726882404.85934: waiting for pending results... 
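The same pattern repeats for the task at roles/network/tasks/main.yml:18, again backed by the fail action. In the next entry only network_state != {} is evaluated (False, so evaluation short-circuits and the task is skipped); the version comparison below is an assumption implied by the task name.

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      fail:
        msg: Applying a network_state configuration requires a managed host running EL8 or later  # assumption
      when:
        - "network_state != {}"                             # evaluated False below; the remaining condition is never reached
        - "ansible_distribution_major_version | int < 8"    # assumption: implied by the task name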
8238 1726882404.86174: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8238 1726882404.86349: in run() - task 0affc7ec-ae25-54bc-d334-00000000007f 8238 1726882404.86361: variable 'ansible_search_path' from source: unknown 8238 1726882404.86377: variable 'ansible_search_path' from source: unknown 8238 1726882404.86460: calling self._execute() 8238 1726882404.86534: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882404.86548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882404.86570: variable 'omit' from source: magic vars 8238 1726882404.86999: variable 'ansible_distribution_major_version' from source: facts 8238 1726882404.87018: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882404.87163: variable 'network_state' from source: role '' defaults 8238 1726882404.87182: Evaluated conditional (network_state != {}): False 8238 1726882404.87189: when evaluation is False, skipping this task 8238 1726882404.87196: _execute() done 8238 1726882404.87203: dumping result to json 8238 1726882404.87219: done dumping result, returning 8238 1726882404.87232: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-54bc-d334-00000000007f] 8238 1726882404.87324: sending task result for task 0affc7ec-ae25-54bc-d334-00000000007f 8238 1726882404.87404: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000007f 8238 1726882404.87407: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882404.87479: no more pending results, returning what we have 8238 1726882404.87484: results queue empty 8238 1726882404.87485: checking for any_errors_fatal 8238 1726882404.87496: done checking for any_errors_fatal 8238 1726882404.87497: checking for max_fail_percentage 8238 1726882404.87499: done checking for max_fail_percentage 8238 1726882404.87500: checking to see if all hosts have failed and the running result is not ok 8238 1726882404.87501: done checking to see if all hosts have failed 8238 1726882404.87502: getting the remaining hosts for this loop 8238 1726882404.87504: done getting the remaining hosts for this loop 8238 1726882404.87508: getting the next task for host managed_node3 8238 1726882404.87518: done getting next task for host managed_node3 8238 1726882404.87525: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8238 1726882404.87530: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882404.87650: getting variables 8238 1726882404.87653: in VariableManager get_vars() 8238 1726882404.87704: Calling all_inventory to load vars for managed_node3 8238 1726882404.87708: Calling groups_inventory to load vars for managed_node3 8238 1726882404.87710: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882404.87840: Calling all_plugins_play to load vars for managed_node3 8238 1726882404.87845: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882404.87849: Calling groups_plugins_play to load vars for managed_node3 8238 1726882404.89644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882404.91975: done with get_vars() 8238 1726882404.92001: done getting variables 8238 1726882404.92074: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:24 -0400 (0:00:00.067) 0:00:35.076 ****** 8238 1726882404.92110: entering _queue_task() for managed_node3/fail 8238 1726882404.92729: worker is 1 (out of 1 available) 8238 1726882404.92741: exiting _queue_task() for managed_node3/fail 8238 1726882404.92752: done queuing things up, now waiting for results queue to drain 8238 1726882404.92756: waiting for pending results... 
8238 1726882404.92891: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8238 1726882404.93096: in run() - task 0affc7ec-ae25-54bc-d334-000000000080 8238 1726882404.93100: variable 'ansible_search_path' from source: unknown 8238 1726882404.93103: variable 'ansible_search_path' from source: unknown 8238 1726882404.93120: calling self._execute() 8238 1726882404.93243: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882404.93262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882404.93280: variable 'omit' from source: magic vars 8238 1726882404.93752: variable 'ansible_distribution_major_version' from source: facts 8238 1726882404.93763: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882404.93998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882404.96584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882404.96693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882404.96711: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882404.96758: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882404.96792: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882404.96899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882404.97017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882404.97020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882404.97031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882404.97053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882404.97171: variable 'ansible_distribution_major_version' from source: facts 8238 1726882404.97192: Evaluated conditional (ansible_distribution_major_version | int > 9): True 8238 1726882404.97333: variable 'ansible_distribution' from source: facts 8238 1726882404.97343: variable '__network_rh_distros' from source: role '' defaults 8238 1726882404.97367: Evaluated conditional (ansible_distribution in __network_rh_distros): False 8238 1726882404.97376: when evaluation is False, skipping this task 8238 1726882404.97384: _execute() done 8238 1726882404.97427: dumping result to json 8238 1726882404.97431: done dumping result, returning 8238 1726882404.97434: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-54bc-d334-000000000080] 8238 1726882404.97437: sending task result for task 0affc7ec-ae25-54bc-d334-000000000080 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 8238 1726882404.97775: no more pending results, returning what we have 8238 1726882404.97779: results queue empty 8238 1726882404.97780: checking for any_errors_fatal 8238 1726882404.97786: done checking for any_errors_fatal 8238 1726882404.97788: checking for max_fail_percentage 8238 1726882404.97790: done checking for max_fail_percentage 8238 1726882404.97791: checking to see if all hosts have failed and the running result is not ok 8238 1726882404.97792: done checking to see if all hosts have failed 8238 1726882404.97793: getting the remaining hosts for this loop 8238 1726882404.97795: done getting the remaining hosts for this loop 8238 1726882404.97799: getting the next task for host managed_node3 8238 1726882404.97807: done getting next task for host managed_node3 8238 1726882404.97812: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8238 1726882404.97816: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882404.97840: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000080 8238 1726882404.97843: WORKER PROCESS EXITING 8238 1726882404.97851: getting variables 8238 1726882404.97852: in VariableManager get_vars() 8238 1726882404.97898: Calling all_inventory to load vars for managed_node3 8238 1726882404.97901: Calling groups_inventory to load vars for managed_node3 8238 1726882404.97903: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882404.97913: Calling all_plugins_play to load vars for managed_node3 8238 1726882404.97916: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882404.97918: Calling groups_plugins_play to load vars for managed_node3 8238 1726882404.99864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.02103: done with get_vars() 8238 1726882405.02135: done getting variables 8238 1726882405.02207: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:25 -0400 (0:00:00.101) 0:00:35.177 ****** 8238 1726882405.02246: entering _queue_task() for managed_node3/dnf 8238 1726882405.02847: worker is 1 (out of 1 available) 8238 1726882405.02862: exiting _queue_task() for managed_node3/dnf 8238 1726882405.02874: done queuing things up, now waiting for results queue to drain 8238 1726882405.02875: waiting for pending results... 
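The next task (roles/network/tasks/main.yml:36) switches from fail to the dnf action. The log shows two conditions: the Fedora-or-EL8+ check passes, while __network_wireless_connections_defined or __network_team_connections_defined is False, so no package check runs. The module arguments below are assumptions purely for illustration; only the task name, the action plugin, and the when expressions come from the log.

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      dnf:
        name: "{{ network_packages }}"   # assumption: the real arguments are not visible in the log
        state: latest                    # assumption
      check_mode: true                   # assumption
      when:
        - "ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7"   # True in the log
        - "__network_wireless_connections_defined or __network_team_connections_defined"       # False, so the task is skipped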
8238 1726882405.03017: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8238 1726882405.03201: in run() - task 0affc7ec-ae25-54bc-d334-000000000081 8238 1726882405.03234: variable 'ansible_search_path' from source: unknown 8238 1726882405.03244: variable 'ansible_search_path' from source: unknown 8238 1726882405.03292: calling self._execute() 8238 1726882405.03406: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.03419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.03442: variable 'omit' from source: magic vars 8238 1726882405.03885: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.03904: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.04147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882405.06782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882405.06875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882405.06931: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882405.06974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882405.07023: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882405.07106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.07327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.07331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.07334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.07336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.07381: variable 'ansible_distribution' from source: facts 8238 1726882405.07392: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.07404: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 8238 1726882405.07538: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882405.07702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.07735: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.07767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.07820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.07842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.07902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.07934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.07966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.08020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.08040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.08089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.08219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.08224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.08227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.08229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.08399: variable 'network_connections' from source: task vars 8238 1726882405.08417: variable 'port2_profile' from source: play vars 8238 1726882405.08494: variable 'port2_profile' from source: play vars 8238 1726882405.08508: variable 'port1_profile' from source: play vars 8238 1726882405.08585: variable 'port1_profile' from source: play vars 8238 1726882405.08598: variable 'controller_profile' from source: play vars 8238 
1726882405.08674: variable 'controller_profile' from source: play vars 8238 1726882405.08764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882405.09344: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882405.09390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882405.09433: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882405.09472: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882405.09529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882405.09639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882405.09643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.09645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882405.09697: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882405.09951: variable 'network_connections' from source: task vars 8238 1726882405.09975: variable 'port2_profile' from source: play vars 8238 1726882405.10052: variable 'port2_profile' from source: play vars 8238 1726882405.10075: variable 'port1_profile' from source: play vars 8238 1726882405.10147: variable 'port1_profile' from source: play vars 8238 1726882405.10165: variable 'controller_profile' from source: play vars 8238 1726882405.10243: variable 'controller_profile' from source: play vars 8238 1726882405.10277: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8238 1726882405.10291: when evaluation is False, skipping this task 8238 1726882405.10300: _execute() done 8238 1726882405.10332: dumping result to json 8238 1726882405.10335: done dumping result, returning 8238 1726882405.10338: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-000000000081] 8238 1726882405.10347: sending task result for task 0affc7ec-ae25-54bc-d334-000000000081 8238 1726882405.10632: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000081 8238 1726882405.10636: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8238 1726882405.10699: no more pending results, returning what we have 8238 1726882405.10703: results queue empty 8238 1726882405.10704: checking for any_errors_fatal 8238 1726882405.10712: done checking for any_errors_fatal 8238 1726882405.10713: checking for max_fail_percentage 8238 1726882405.10714: done checking for max_fail_percentage 8238 
1726882405.10715: checking to see if all hosts have failed and the running result is not ok 8238 1726882405.10716: done checking to see if all hosts have failed 8238 1726882405.10717: getting the remaining hosts for this loop 8238 1726882405.10719: done getting the remaining hosts for this loop 8238 1726882405.10728: getting the next task for host managed_node3 8238 1726882405.10928: done getting next task for host managed_node3 8238 1726882405.10933: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8238 1726882405.10938: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882405.10958: getting variables 8238 1726882405.10960: in VariableManager get_vars() 8238 1726882405.11000: Calling all_inventory to load vars for managed_node3 8238 1726882405.11003: Calling groups_inventory to load vars for managed_node3 8238 1726882405.11006: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882405.11016: Calling all_plugins_play to load vars for managed_node3 8238 1726882405.11019: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882405.11028: Calling groups_plugins_play to load vars for managed_node3 8238 1726882405.13017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.15205: done with get_vars() 8238 1726882405.15238: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8238 1726882405.15330: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:25 -0400 (0:00:00.131) 0:00:35.308 ****** 8238 1726882405.15374: entering _queue_task() for managed_node3/yum 8238 1726882405.15848: worker is 1 (out of 1 available) 8238 1726882405.15865: exiting _queue_task() for managed_node3/yum 8238 1726882405.15878: done queuing things up, now waiting for results queue to drain 8238 1726882405.15880: waiting for pending results... 
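Its YUM counterpart at roles/network/tasks/main.yml:48 is guarded the other way: ansible_distribution_major_version | int < 8 evaluates False on this host, so it is skipped before any wireless/team check. Note the entry above where ansible.builtin.yum is redirected to the ansible.builtin.dnf action plugin on this control node. A sketch, with the module arguments again assumed rather than read from the role:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      yum:
        name: "{{ network_packages }}"   # assumption
        state: latest                    # assumption
      check_mode: true                   # assumption
      when:
        - "ansible_distribution_major_version | int < 8"   # evaluated False below, so the task is skipped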
8238 1726882405.16145: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8238 1726882405.16250: in run() - task 0affc7ec-ae25-54bc-d334-000000000082 8238 1726882405.16265: variable 'ansible_search_path' from source: unknown 8238 1726882405.16269: variable 'ansible_search_path' from source: unknown 8238 1726882405.16306: calling self._execute() 8238 1726882405.16385: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.16390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.16401: variable 'omit' from source: magic vars 8238 1726882405.16699: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.16709: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.16850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882405.18893: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882405.18952: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882405.18989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882405.19014: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882405.19043: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882405.19113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.19135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.19153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.19186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.19198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.19275: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.19289: Evaluated conditional (ansible_distribution_major_version | int < 8): False 8238 1726882405.19292: when evaluation is False, skipping this task 8238 1726882405.19295: _execute() done 8238 1726882405.19299: dumping result to json 8238 1726882405.19302: done dumping result, returning 8238 1726882405.19312: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-000000000082] 8238 1726882405.19316: sending task result for 
task 0affc7ec-ae25-54bc-d334-000000000082 8238 1726882405.19417: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000082 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 8238 1726882405.19468: no more pending results, returning what we have 8238 1726882405.19472: results queue empty 8238 1726882405.19473: checking for any_errors_fatal 8238 1726882405.19479: done checking for any_errors_fatal 8238 1726882405.19479: checking for max_fail_percentage 8238 1726882405.19481: done checking for max_fail_percentage 8238 1726882405.19482: checking to see if all hosts have failed and the running result is not ok 8238 1726882405.19483: done checking to see if all hosts have failed 8238 1726882405.19484: getting the remaining hosts for this loop 8238 1726882405.19485: done getting the remaining hosts for this loop 8238 1726882405.19489: getting the next task for host managed_node3 8238 1726882405.19498: done getting next task for host managed_node3 8238 1726882405.19502: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8238 1726882405.19506: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882405.19528: getting variables 8238 1726882405.19529: in VariableManager get_vars() 8238 1726882405.19569: Calling all_inventory to load vars for managed_node3 8238 1726882405.19572: Calling groups_inventory to load vars for managed_node3 8238 1726882405.19574: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882405.19584: Calling all_plugins_play to load vars for managed_node3 8238 1726882405.19586: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882405.19589: Calling groups_plugins_play to load vars for managed_node3 8238 1726882405.20139: WORKER PROCESS EXITING 8238 1726882405.20573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.22462: done with get_vars() 8238 1726882405.22480: done getting variables 8238 1726882405.22530: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:25 -0400 (0:00:00.071) 0:00:35.380 ****** 8238 1726882405.22557: entering _queue_task() for managed_node3/fail 8238 1726882405.22820: worker is 1 (out of 1 available) 8238 1726882405.22837: exiting _queue_task() for managed_node3/fail 8238 1726882405.22851: done queuing things up, now waiting for results queue to drain 8238 1726882405.22852: waiting for pending results... 
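The task at roles/network/tasks/main.yml:60 is another fail guard: it would stop the run to ask for consent before restarting NetworkManager, but only when wireless or team connections are defined. In this run __network_wireless_connections_defined or __network_team_connections_defined evaluates False (see the entry a few lines below), so it is skipped. The message and any extra opt-in variable are assumptions.

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      fail:
        msg: Wireless or team interfaces require restarting NetworkManager; confirm before re-running  # assumption
      when:
        - "__network_wireless_connections_defined or __network_team_connections_defined"   # False in this run, so the task is skipped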
8238 1726882405.23039: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8238 1726882405.23149: in run() - task 0affc7ec-ae25-54bc-d334-000000000083 8238 1726882405.23163: variable 'ansible_search_path' from source: unknown 8238 1726882405.23167: variable 'ansible_search_path' from source: unknown 8238 1726882405.23200: calling self._execute() 8238 1726882405.23274: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.23278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.23288: variable 'omit' from source: magic vars 8238 1726882405.23580: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.23590: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.23682: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882405.23830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882405.30549: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882405.30593: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882405.30624: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882405.30647: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882405.30678: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882405.30738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.30758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.30779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.30806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.30818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.30861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.30878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.30895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 8238 1726882405.30923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.30934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.30971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.30988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.31005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.31033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.31044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.31168: variable 'network_connections' from source: task vars 8238 1726882405.31173: variable 'port2_profile' from source: play vars 8238 1726882405.31226: variable 'port2_profile' from source: play vars 8238 1726882405.31233: variable 'port1_profile' from source: play vars 8238 1726882405.31283: variable 'port1_profile' from source: play vars 8238 1726882405.31290: variable 'controller_profile' from source: play vars 8238 1726882405.31334: variable 'controller_profile' from source: play vars 8238 1726882405.31387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882405.31493: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882405.31529: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882405.31552: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882405.31574: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882405.31612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882405.31639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882405.31660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.31678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882405.31713: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882405.31871: variable 'network_connections' from source: task vars 8238 1726882405.31874: variable 'port2_profile' from source: play vars 8238 1726882405.31918: variable 'port2_profile' from source: play vars 8238 1726882405.31931: variable 'port1_profile' from source: play vars 8238 1726882405.31973: variable 'port1_profile' from source: play vars 8238 1726882405.31979: variable 'controller_profile' from source: play vars 8238 1726882405.32024: variable 'controller_profile' from source: play vars 8238 1726882405.32047: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8238 1726882405.32061: when evaluation is False, skipping this task 8238 1726882405.32064: _execute() done 8238 1726882405.32067: dumping result to json 8238 1726882405.32069: done dumping result, returning 8238 1726882405.32071: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-000000000083] 8238 1726882405.32074: sending task result for task 0affc7ec-ae25-54bc-d334-000000000083 8238 1726882405.32170: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000083 8238 1726882405.32173: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8238 1726882405.32230: no more pending results, returning what we have 8238 1726882405.32234: results queue empty 8238 1726882405.32235: checking for any_errors_fatal 8238 1726882405.32240: done checking for any_errors_fatal 8238 1726882405.32241: checking for max_fail_percentage 8238 1726882405.32242: done checking for max_fail_percentage 8238 1726882405.32243: checking to see if all hosts have failed and the running result is not ok 8238 1726882405.32244: done checking to see if all hosts have failed 8238 1726882405.32245: getting the remaining hosts for this loop 8238 1726882405.32246: done getting the remaining hosts for this loop 8238 1726882405.32251: getting the next task for host managed_node3 8238 1726882405.32261: done getting next task for host managed_node3 8238 1726882405.32265: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 8238 1726882405.32269: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882405.32287: getting variables 8238 1726882405.32289: in VariableManager get_vars() 8238 1726882405.32330: Calling all_inventory to load vars for managed_node3 8238 1726882405.32332: Calling groups_inventory to load vars for managed_node3 8238 1726882405.32335: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882405.32344: Calling all_plugins_play to load vars for managed_node3 8238 1726882405.32346: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882405.32349: Calling groups_plugins_play to load vars for managed_node3 8238 1726882405.36884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.38025: done with get_vars() 8238 1726882405.38044: done getting variables 8238 1726882405.38084: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:25 -0400 (0:00:00.155) 0:00:35.536 ****** 8238 1726882405.38108: entering _queue_task() for managed_node3/package 8238 1726882405.38398: worker is 1 (out of 1 available) 8238 1726882405.38413: exiting _queue_task() for managed_node3/package 8238 1726882405.38427: done queuing things up, now waiting for results queue to drain 8238 1726882405.38430: waiting for pending results... 8238 1726882405.38615: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 8238 1726882405.38730: in run() - task 0affc7ec-ae25-54bc-d334-000000000084 8238 1726882405.38749: variable 'ansible_search_path' from source: unknown 8238 1726882405.38753: variable 'ansible_search_path' from source: unknown 8238 1726882405.38781: calling self._execute() 8238 1726882405.38861: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.38865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.38870: variable 'omit' from source: magic vars 8238 1726882405.39174: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.39185: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.39332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882405.39534: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882405.39568: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882405.39631: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882405.39727: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882405.39746: variable 'network_packages' from source: role '' defaults 8238 1726882405.39827: variable '__network_provider_setup' from source: role '' defaults 8238 1726882405.39836: variable '__network_service_name_default_nm' from source: role '' defaults 8238 1726882405.39889: variable '__network_service_name_default_nm' from source: role '' 
defaults 8238 1726882405.39896: variable '__network_packages_default_nm' from source: role '' defaults 8238 1726882405.39944: variable '__network_packages_default_nm' from source: role '' defaults 8238 1726882405.40082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882405.41528: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882405.41575: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882405.41605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882405.41632: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882405.41776: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882405.41848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.41872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.41891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.41922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.41936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.41975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.41992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.42011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.42046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.42055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.42214: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8238 1726882405.42299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
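The entries above and below resolve the package list for the Install packages task at roles/network/tasks/main.yml:73: network_packages is assembled from the provider defaults (__network_packages_default_nm plus the wpa_supplicant, wireless, team, and initscripts defaults) and then compared against the package facts. Further down, not network_packages is subset(ansible_facts.packages.keys()) evaluates False, meaning everything is already installed, so the package action is skipped. A sketch of the task; the module arguments are assumed, while the when expression is taken verbatim from the log.

    - name: Install packages
      package:
        name: "{{ network_packages }}"   # assumption: arguments not visible in the log
        state: present                   # assumption
      when:
        - "not network_packages is subset(ansible_facts.packages.keys())"   # evaluated False below, so installation is skipped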
8238 1726882405.42317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.42336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.42371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.42382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.42448: variable 'ansible_python' from source: facts 8238 1726882405.42474: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8238 1726882405.42534: variable '__network_wpa_supplicant_required' from source: role '' defaults 8238 1726882405.42596: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8238 1726882405.42690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.42710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.42730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.42757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.42770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.42811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.42832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.42851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.42880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.42891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.42998: variable 
'network_connections' from source: task vars 8238 1726882405.43006: variable 'port2_profile' from source: play vars 8238 1726882405.43084: variable 'port2_profile' from source: play vars 8238 1726882405.43093: variable 'port1_profile' from source: play vars 8238 1726882405.43171: variable 'port1_profile' from source: play vars 8238 1726882405.43179: variable 'controller_profile' from source: play vars 8238 1726882405.43261: variable 'controller_profile' from source: play vars 8238 1726882405.43309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882405.43335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882405.43361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.43383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882405.43421: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882405.43619: variable 'network_connections' from source: task vars 8238 1726882405.43624: variable 'port2_profile' from source: play vars 8238 1726882405.43703: variable 'port2_profile' from source: play vars 8238 1726882405.43711: variable 'port1_profile' from source: play vars 8238 1726882405.43788: variable 'port1_profile' from source: play vars 8238 1726882405.43795: variable 'controller_profile' from source: play vars 8238 1726882405.43873: variable 'controller_profile' from source: play vars 8238 1726882405.43899: variable '__network_packages_default_wireless' from source: role '' defaults 8238 1726882405.43959: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882405.44173: variable 'network_connections' from source: task vars 8238 1726882405.44177: variable 'port2_profile' from source: play vars 8238 1726882405.44226: variable 'port2_profile' from source: play vars 8238 1726882405.44233: variable 'port1_profile' from source: play vars 8238 1726882405.44282: variable 'port1_profile' from source: play vars 8238 1726882405.44288: variable 'controller_profile' from source: play vars 8238 1726882405.44338: variable 'controller_profile' from source: play vars 8238 1726882405.44359: variable '__network_packages_default_team' from source: role '' defaults 8238 1726882405.44413: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882405.44630: variable 'network_connections' from source: task vars 8238 1726882405.44635: variable 'port2_profile' from source: play vars 8238 1726882405.44684: variable 'port2_profile' from source: play vars 8238 1726882405.44691: variable 'port1_profile' from source: play vars 8238 1726882405.44739: variable 'port1_profile' from source: play vars 8238 1726882405.44747: variable 'controller_profile' from source: play vars 8238 1726882405.44795: variable 'controller_profile' from source: play vars 8238 1726882405.44837: variable '__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882405.44884: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882405.44888: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882405.44937: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882405.45085: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8238 1726882405.45430: variable 'network_connections' from source: task vars 8238 1726882405.45435: variable 'port2_profile' from source: play vars 8238 1726882405.45481: variable 'port2_profile' from source: play vars 8238 1726882405.45488: variable 'port1_profile' from source: play vars 8238 1726882405.45535: variable 'port1_profile' from source: play vars 8238 1726882405.45541: variable 'controller_profile' from source: play vars 8238 1726882405.45588: variable 'controller_profile' from source: play vars 8238 1726882405.45595: variable 'ansible_distribution' from source: facts 8238 1726882405.45598: variable '__network_rh_distros' from source: role '' defaults 8238 1726882405.45604: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.45617: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8238 1726882405.45740: variable 'ansible_distribution' from source: facts 8238 1726882405.45743: variable '__network_rh_distros' from source: role '' defaults 8238 1726882405.45746: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.45753: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8238 1726882405.45869: variable 'ansible_distribution' from source: facts 8238 1726882405.45875: variable '__network_rh_distros' from source: role '' defaults 8238 1726882405.45878: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.45903: variable 'network_provider' from source: set_fact 8238 1726882405.45914: variable 'ansible_facts' from source: unknown 8238 1726882405.46475: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 8238 1726882405.46479: when evaluation is False, skipping this task 8238 1726882405.46482: _execute() done 8238 1726882405.46484: dumping result to json 8238 1726882405.46487: done dumping result, returning 8238 1726882405.46496: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-54bc-d334-000000000084] 8238 1726882405.46499: sending task result for task 0affc7ec-ae25-54bc-d334-000000000084 8238 1726882405.46600: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000084 8238 1726882405.46603: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 8238 1726882405.46659: no more pending results, returning what we have 8238 1726882405.46663: results queue empty 8238 1726882405.46664: checking for any_errors_fatal 8238 1726882405.46673: done checking for any_errors_fatal 8238 1726882405.46674: checking for max_fail_percentage 8238 1726882405.46675: done checking for max_fail_percentage 8238 1726882405.46676: checking to see if all hosts have failed and the running result is not ok 8238 1726882405.46678: done checking to see if all hosts have failed 8238 1726882405.46678: getting the remaining hosts for this loop 8238 1726882405.46680: done getting the remaining hosts for 
this loop 8238 1726882405.46688: getting the next task for host managed_node3 8238 1726882405.46697: done getting next task for host managed_node3 8238 1726882405.46701: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8238 1726882405.46704: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882405.46740: getting variables 8238 1726882405.46742: in VariableManager get_vars() 8238 1726882405.46782: Calling all_inventory to load vars for managed_node3 8238 1726882405.46785: Calling groups_inventory to load vars for managed_node3 8238 1726882405.46787: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882405.46796: Calling all_plugins_play to load vars for managed_node3 8238 1726882405.46799: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882405.46802: Calling groups_plugins_play to load vars for managed_node3 8238 1726882405.49063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.51317: done with get_vars() 8238 1726882405.51348: done getting variables 8238 1726882405.51416: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:25 -0400 (0:00:00.133) 0:00:35.669 ****** 8238 1726882405.51456: entering _queue_task() for managed_node3/package 8238 1726882405.52251: worker is 1 (out of 1 available) 8238 1726882405.52263: exiting _queue_task() for managed_node3/package 8238 1726882405.52273: done queuing things up, now waiting for results queue to drain 8238 1726882405.52275: waiting for pending results... 
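The "Install packages" task above was skipped because its guard, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False: every package in network_packages already appears in the package facts gathered earlier in the run. A standalone sketch of that pattern, using a hypothetical needed_packages list rather than the role's verbatim task:

    - name: Gather installed-package facts for the guard
      ansible.builtin.package_facts:
        manager: auto

    - name: Install packages only when some are missing
      ansible.builtin.package:
        name: "{{ needed_packages }}"   # hypothetical variable for this sketch
        state: present
      when: not needed_packages is subset(ansible_facts.packages.keys())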
8238 1726882405.52596: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8238 1726882405.52719: in run() - task 0affc7ec-ae25-54bc-d334-000000000085 8238 1726882405.52933: variable 'ansible_search_path' from source: unknown 8238 1726882405.52943: variable 'ansible_search_path' from source: unknown 8238 1726882405.52987: calling self._execute() 8238 1726882405.53118: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.53134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.53154: variable 'omit' from source: magic vars 8238 1726882405.53649: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.53669: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.53835: variable 'network_state' from source: role '' defaults 8238 1726882405.53853: Evaluated conditional (network_state != {}): False 8238 1726882405.53863: when evaluation is False, skipping this task 8238 1726882405.53872: _execute() done 8238 1726882405.53880: dumping result to json 8238 1726882405.53890: done dumping result, returning 8238 1726882405.53901: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-54bc-d334-000000000085] 8238 1726882405.53919: sending task result for task 0affc7ec-ae25-54bc-d334-000000000085 8238 1726882405.54128: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000085 8238 1726882405.54131: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882405.54183: no more pending results, returning what we have 8238 1726882405.54187: results queue empty 8238 1726882405.54188: checking for any_errors_fatal 8238 1726882405.54194: done checking for any_errors_fatal 8238 1726882405.54195: checking for max_fail_percentage 8238 1726882405.54196: done checking for max_fail_percentage 8238 1726882405.54197: checking to see if all hosts have failed and the running result is not ok 8238 1726882405.54198: done checking to see if all hosts have failed 8238 1726882405.54199: getting the remaining hosts for this loop 8238 1726882405.54201: done getting the remaining hosts for this loop 8238 1726882405.54204: getting the next task for host managed_node3 8238 1726882405.54213: done getting next task for host managed_node3 8238 1726882405.54217: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8238 1726882405.54224: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8238 1726882405.54246: getting variables 8238 1726882405.54248: in VariableManager get_vars() 8238 1726882405.54292: Calling all_inventory to load vars for managed_node3 8238 1726882405.54295: Calling groups_inventory to load vars for managed_node3 8238 1726882405.54297: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882405.54311: Calling all_plugins_play to load vars for managed_node3 8238 1726882405.54314: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882405.54318: Calling groups_plugins_play to load vars for managed_node3 8238 1726882405.55509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.56900: done with get_vars() 8238 1726882405.56927: done getting variables 8238 1726882405.56989: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:25 -0400 (0:00:00.055) 0:00:35.725 ****** 8238 1726882405.57029: entering _queue_task() for managed_node3/package 8238 1726882405.57306: worker is 1 (out of 1 available) 8238 1726882405.57318: exiting _queue_task() for managed_node3/package 8238 1726882405.57425: done queuing things up, now waiting for results queue to drain 8238 1726882405.57427: waiting for pending results... 
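Both "... when using network_state variable" install tasks in this run are skipped because network_state is left at its empty default, so the guard network_state != {} is False. A sketch of the task shape the log implies, using the package action that was just loaded (illustrative only, not the role's verbatim source):

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}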
8238 1726882405.57605: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8238 1726882405.57704: in run() - task 0affc7ec-ae25-54bc-d334-000000000086 8238 1726882405.57715: variable 'ansible_search_path' from source: unknown 8238 1726882405.57719: variable 'ansible_search_path' from source: unknown 8238 1726882405.57752: calling self._execute() 8238 1726882405.57831: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.57835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.57843: variable 'omit' from source: magic vars 8238 1726882405.58137: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.58150: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.58235: variable 'network_state' from source: role '' defaults 8238 1726882405.58244: Evaluated conditional (network_state != {}): False 8238 1726882405.58247: when evaluation is False, skipping this task 8238 1726882405.58252: _execute() done 8238 1726882405.58255: dumping result to json 8238 1726882405.58257: done dumping result, returning 8238 1726882405.58268: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-54bc-d334-000000000086] 8238 1726882405.58272: sending task result for task 0affc7ec-ae25-54bc-d334-000000000086 8238 1726882405.58370: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000086 8238 1726882405.58373: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882405.58419: no more pending results, returning what we have 8238 1726882405.58424: results queue empty 8238 1726882405.58425: checking for any_errors_fatal 8238 1726882405.58433: done checking for any_errors_fatal 8238 1726882405.58434: checking for max_fail_percentage 8238 1726882405.58435: done checking for max_fail_percentage 8238 1726882405.58436: checking to see if all hosts have failed and the running result is not ok 8238 1726882405.58437: done checking to see if all hosts have failed 8238 1726882405.58437: getting the remaining hosts for this loop 8238 1726882405.58439: done getting the remaining hosts for this loop 8238 1726882405.58442: getting the next task for host managed_node3 8238 1726882405.58448: done getting next task for host managed_node3 8238 1726882405.58452: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8238 1726882405.58455: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8238 1726882405.58472: getting variables 8238 1726882405.58473: in VariableManager get_vars() 8238 1726882405.58506: Calling all_inventory to load vars for managed_node3 8238 1726882405.58509: Calling groups_inventory to load vars for managed_node3 8238 1726882405.58511: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882405.58520: Calling all_plugins_play to load vars for managed_node3 8238 1726882405.58525: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882405.58528: Calling groups_plugins_play to load vars for managed_node3 8238 1726882405.59601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.61310: done with get_vars() 8238 1726882405.61329: done getting variables 8238 1726882405.61376: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:25 -0400 (0:00:00.043) 0:00:35.769 ****** 8238 1726882405.61402: entering _queue_task() for managed_node3/service 8238 1726882405.61619: worker is 1 (out of 1 available) 8238 1726882405.61635: exiting _queue_task() for managed_node3/service 8238 1726882405.61647: done queuing things up, now waiting for results queue to drain 8238 1726882405.61648: waiting for pending results... 
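The "Restart NetworkManager due to wireless or team interfaces" task queued above uses the service action and, as shown below, is skipped because neither wireless nor team connections appear in network_connections. A sketch of that shape, assuming a plain service restart (not the role's verbatim task):

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager   # assumed unit name for this sketch
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined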
8238 1726882405.61829: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8238 1726882405.61938: in run() - task 0affc7ec-ae25-54bc-d334-000000000087 8238 1726882405.61950: variable 'ansible_search_path' from source: unknown 8238 1726882405.61954: variable 'ansible_search_path' from source: unknown 8238 1726882405.61990: calling self._execute() 8238 1726882405.62065: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.62071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.62079: variable 'omit' from source: magic vars 8238 1726882405.62370: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.62379: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.62504: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882405.62937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882405.64929: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882405.64984: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882405.65013: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882405.65043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882405.65066: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882405.65129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.65157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.65175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.65203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.65215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.65258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.65275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.65293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8238 1726882405.65320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.65333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.65369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.65386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.65404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.65431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.65443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.65572: variable 'network_connections' from source: task vars 8238 1726882405.65581: variable 'port2_profile' from source: play vars 8238 1726882405.65632: variable 'port2_profile' from source: play vars 8238 1726882405.65640: variable 'port1_profile' from source: play vars 8238 1726882405.65688: variable 'port1_profile' from source: play vars 8238 1726882405.65695: variable 'controller_profile' from source: play vars 8238 1726882405.65741: variable 'controller_profile' from source: play vars 8238 1726882405.65797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882405.66038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882405.66067: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882405.66091: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882405.66114: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882405.66150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882405.66168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882405.66186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.66205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 8238 1726882405.66248: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882405.66589: variable 'network_connections' from source: task vars 8238 1726882405.66593: variable 'port2_profile' from source: play vars 8238 1726882405.66595: variable 'port2_profile' from source: play vars 8238 1726882405.66598: variable 'port1_profile' from source: play vars 8238 1726882405.66732: variable 'port1_profile' from source: play vars 8238 1726882405.66735: variable 'controller_profile' from source: play vars 8238 1726882405.66738: variable 'controller_profile' from source: play vars 8238 1726882405.66741: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8238 1726882405.66750: when evaluation is False, skipping this task 8238 1726882405.66753: _execute() done 8238 1726882405.66758: dumping result to json 8238 1726882405.66761: done dumping result, returning 8238 1726882405.66763: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-54bc-d334-000000000087] 8238 1726882405.66765: sending task result for task 0affc7ec-ae25-54bc-d334-000000000087 8238 1726882405.66884: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000087 8238 1726882405.66888: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8238 1726882405.66949: no more pending results, returning what we have 8238 1726882405.66953: results queue empty 8238 1726882405.66957: checking for any_errors_fatal 8238 1726882405.66963: done checking for any_errors_fatal 8238 1726882405.66964: checking for max_fail_percentage 8238 1726882405.66966: done checking for max_fail_percentage 8238 1726882405.66967: checking to see if all hosts have failed and the running result is not ok 8238 1726882405.66968: done checking to see if all hosts have failed 8238 1726882405.66969: getting the remaining hosts for this loop 8238 1726882405.66971: done getting the remaining hosts for this loop 8238 1726882405.66975: getting the next task for host managed_node3 8238 1726882405.66984: done getting next task for host managed_node3 8238 1726882405.66989: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8238 1726882405.66993: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882405.67013: getting variables 8238 1726882405.67015: in VariableManager get_vars() 8238 1726882405.67108: Calling all_inventory to load vars for managed_node3 8238 1726882405.67111: Calling groups_inventory to load vars for managed_node3 8238 1726882405.67113: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882405.67126: Calling all_plugins_play to load vars for managed_node3 8238 1726882405.67129: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882405.67132: Calling groups_plugins_play to load vars for managed_node3 8238 1726882405.68539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882405.69697: done with get_vars() 8238 1726882405.69714: done getting variables 8238 1726882405.69764: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:25 -0400 (0:00:00.083) 0:00:35.853 ****** 8238 1726882405.69793: entering _queue_task() for managed_node3/service 8238 1726882405.70031: worker is 1 (out of 1 available) 8238 1726882405.70045: exiting _queue_task() for managed_node3/service 8238 1726882405.70057: done queuing things up, now waiting for results queue to drain 8238 1726882405.70059: waiting for pending results... 8238 1726882405.70249: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8238 1726882405.70354: in run() - task 0affc7ec-ae25-54bc-d334-000000000088 8238 1726882405.70368: variable 'ansible_search_path' from source: unknown 8238 1726882405.70372: variable 'ansible_search_path' from source: unknown 8238 1726882405.70407: calling self._execute() 8238 1726882405.70491: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.70497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.70509: variable 'omit' from source: magic vars 8238 1726882405.70806: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.70816: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882405.70942: variable 'network_provider' from source: set_fact 8238 1726882405.70948: variable 'network_state' from source: role '' defaults 8238 1726882405.70961: Evaluated conditional (network_provider == "nm" or network_state != {}): True 8238 1726882405.70966: variable 'omit' from source: magic vars 8238 1726882405.71016: variable 'omit' from source: magic vars 8238 1726882405.71040: variable 'network_service_name' from source: role '' defaults 8238 1726882405.71093: variable 'network_service_name' from source: role '' defaults 8238 1726882405.71175: variable '__network_provider_setup' from source: role '' defaults 8238 1726882405.71179: variable '__network_service_name_default_nm' from source: role '' defaults 8238 1726882405.71226: variable '__network_service_name_default_nm' from source: role '' defaults 8238 1726882405.71233: variable '__network_packages_default_nm' from source: role '' defaults 8238 1726882405.71285: variable 
'__network_packages_default_nm' from source: role '' defaults 8238 1726882405.71442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882405.73027: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882405.73081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882405.73110: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882405.73140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882405.73163: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882405.73228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.73253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.73275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.73303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.73314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.73355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.73374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.73392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.73418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.73430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.73597: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8238 1726882405.73683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.73701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.73718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.73746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.73757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.73826: variable 'ansible_python' from source: facts 8238 1726882405.73843: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8238 1726882405.73906: variable '__network_wpa_supplicant_required' from source: role '' defaults 8238 1726882405.73965: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8238 1726882405.74059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.74079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.74097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.74129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.74140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.74179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882405.74199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882405.74220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.74250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882405.74263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882405.74363: variable 'network_connections' from source: task vars 8238 1726882405.74369: variable 'port2_profile' from source: play vars 
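Unlike the earlier guards, "Enable and start NetworkManager" evaluated its condition (network_provider == "nm" or network_state != {}) to True, so the run below goes on to build an SSH connection and push the systemd module to the managed host. A sketch of the task shape implied by the log, assuming network_service_name resolves to the NetworkManager unit for the nm provider (not the role's verbatim source):

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # role default; assumed to be NetworkManager here
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}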
8238 1726882405.74426: variable 'port2_profile' from source: play vars 8238 1726882405.74439: variable 'port1_profile' from source: play vars 8238 1726882405.74494: variable 'port1_profile' from source: play vars 8238 1726882405.74503: variable 'controller_profile' from source: play vars 8238 1726882405.74563: variable 'controller_profile' from source: play vars 8238 1726882405.74641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882405.74784: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882405.74821: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882405.74855: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882405.74892: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882405.74941: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882405.74966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882405.74994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882405.75019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882405.75062: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882405.75258: variable 'network_connections' from source: task vars 8238 1726882405.75267: variable 'port2_profile' from source: play vars 8238 1726882405.75326: variable 'port2_profile' from source: play vars 8238 1726882405.75336: variable 'port1_profile' from source: play vars 8238 1726882405.75391: variable 'port1_profile' from source: play vars 8238 1726882405.75401: variable 'controller_profile' from source: play vars 8238 1726882405.75459: variable 'controller_profile' from source: play vars 8238 1726882405.75486: variable '__network_packages_default_wireless' from source: role '' defaults 8238 1726882405.75546: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882405.75753: variable 'network_connections' from source: task vars 8238 1726882405.75756: variable 'port2_profile' from source: play vars 8238 1726882405.75808: variable 'port2_profile' from source: play vars 8238 1726882405.75814: variable 'port1_profile' from source: play vars 8238 1726882405.75871: variable 'port1_profile' from source: play vars 8238 1726882405.75878: variable 'controller_profile' from source: play vars 8238 1726882405.75929: variable 'controller_profile' from source: play vars 8238 1726882405.75947: variable '__network_packages_default_team' from source: role '' defaults 8238 1726882405.76009: variable '__network_team_connections_defined' from source: role '' defaults 8238 1726882405.76215: variable 'network_connections' from source: task vars 8238 1726882405.76218: variable 'port2_profile' from source: play vars 8238 1726882405.76274: 
variable 'port2_profile' from source: play vars 8238 1726882405.76283: variable 'port1_profile' from source: play vars 8238 1726882405.76336: variable 'port1_profile' from source: play vars 8238 1726882405.76342: variable 'controller_profile' from source: play vars 8238 1726882405.76397: variable 'controller_profile' from source: play vars 8238 1726882405.76440: variable '__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882405.76487: variable '__network_service_name_default_initscripts' from source: role '' defaults 8238 1726882405.76492: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882405.76542: variable '__network_packages_default_initscripts' from source: role '' defaults 8238 1726882405.76694: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8238 1726882405.77050: variable 'network_connections' from source: task vars 8238 1726882405.77055: variable 'port2_profile' from source: play vars 8238 1726882405.77102: variable 'port2_profile' from source: play vars 8238 1726882405.77108: variable 'port1_profile' from source: play vars 8238 1726882405.77154: variable 'port1_profile' from source: play vars 8238 1726882405.77162: variable 'controller_profile' from source: play vars 8238 1726882405.77207: variable 'controller_profile' from source: play vars 8238 1726882405.77214: variable 'ansible_distribution' from source: facts 8238 1726882405.77217: variable '__network_rh_distros' from source: role '' defaults 8238 1726882405.77226: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.77238: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8238 1726882405.77363: variable 'ansible_distribution' from source: facts 8238 1726882405.77367: variable '__network_rh_distros' from source: role '' defaults 8238 1726882405.77373: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.77381: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8238 1726882405.77500: variable 'ansible_distribution' from source: facts 8238 1726882405.77510: variable '__network_rh_distros' from source: role '' defaults 8238 1726882405.77515: variable 'ansible_distribution_major_version' from source: facts 8238 1726882405.77542: variable 'network_provider' from source: set_fact 8238 1726882405.77601: variable 'omit' from source: magic vars 8238 1726882405.77605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882405.77607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882405.77624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882405.77638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882405.77647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882405.77674: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882405.77678: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.77680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.77761: Set connection var ansible_connection to ssh 8238 1726882405.77764: Set 
connection var ansible_shell_type to sh 8238 1726882405.77767: Set connection var ansible_pipelining to False 8238 1726882405.77772: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882405.77778: Set connection var ansible_timeout to 10 8238 1726882405.77785: Set connection var ansible_shell_executable to /bin/sh 8238 1726882405.77804: variable 'ansible_shell_executable' from source: unknown 8238 1726882405.77807: variable 'ansible_connection' from source: unknown 8238 1726882405.77809: variable 'ansible_module_compression' from source: unknown 8238 1726882405.77813: variable 'ansible_shell_type' from source: unknown 8238 1726882405.77816: variable 'ansible_shell_executable' from source: unknown 8238 1726882405.77818: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882405.77820: variable 'ansible_pipelining' from source: unknown 8238 1726882405.77927: variable 'ansible_timeout' from source: unknown 8238 1726882405.77930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882405.77933: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882405.77937: variable 'omit' from source: magic vars 8238 1726882405.77939: starting attempt loop 8238 1726882405.77941: running the handler 8238 1726882405.77988: variable 'ansible_facts' from source: unknown 8238 1726882405.78595: _low_level_execute_command(): starting 8238 1726882405.78600: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882405.79133: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882405.79137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882405.79140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882405.79142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882405.79145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882405.79199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882405.79202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882405.79300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882405.81048: stdout chunk (state=3): >>>/root <<< 8238 1726882405.81166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882405.81214: stderr chunk (state=3): >>><<< 8238 
1726882405.81219: stdout chunk (state=3): >>><<< 8238 1726882405.81241: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882405.81251: _low_level_execute_command(): starting 8238 1726882405.81257: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376 `" && echo ansible-tmp-1726882405.8124075-9644-197235666699376="` echo /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376 `" ) && sleep 0' 8238 1726882405.81708: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882405.81711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882405.81714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882405.81716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882405.81718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882405.81777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882405.81784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882405.81860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882405.83863: stdout chunk (state=3): >>>ansible-tmp-1726882405.8124075-9644-197235666699376=/root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376 <<< 8238 1726882405.83972: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 8238 1726882405.84017: stderr chunk (state=3): >>><<< 8238 1726882405.84020: stdout chunk (state=3): >>><<< 8238 1726882405.84040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882405.8124075-9644-197235666699376=/root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882405.84071: variable 'ansible_module_compression' from source: unknown 8238 1726882405.84109: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8238 1726882405.84170: variable 'ansible_facts' from source: unknown 8238 1726882405.84307: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/AnsiballZ_systemd.py 8238 1726882405.84419: Sending initial data 8238 1726882405.84425: Sent initial data (154 bytes) 8238 1726882405.84905: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882405.84908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882405.84910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882405.84913: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882405.84915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882405.84972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882405.84975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882405.84979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 8238 1726882405.85063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882405.86700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882405.86796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882405.86874: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpkz8yvico /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/AnsiballZ_systemd.py <<< 8238 1726882405.86881: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/AnsiballZ_systemd.py" <<< 8238 1726882405.86957: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpkz8yvico" to remote "/root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/AnsiballZ_systemd.py" <<< 8238 1726882405.88254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882405.88308: stderr chunk (state=3): >>><<< 8238 1726882405.88312: stdout chunk (state=3): >>><<< 8238 1726882405.88336: done transferring module to remote 8238 1726882405.88346: _low_level_execute_command(): starting 8238 1726882405.88350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/ /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/AnsiballZ_systemd.py && sleep 0' 8238 1726882405.88951: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 
1726882405.89049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882405.90857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882405.90952: stderr chunk (state=3): >>><<< 8238 1726882405.90955: stdout chunk (state=3): >>><<< 8238 1726882405.90990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882405.91016: _low_level_execute_command(): starting 8238 1726882405.91049: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/AnsiballZ_systemd.py && sleep 0' 8238 1726882405.91566: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882405.91629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882405.91640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882405.91680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882405.91771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882406.24365: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", 
"TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11718656", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3532210176", "CPUUsageNSec": "803200000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", 
"CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": 
"15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8238 1726882406.26269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882406.26284: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882406.26330: stderr chunk (state=3): >>><<< 8238 1726882406.26333: stdout chunk (state=3): >>><<< 8238 1726882406.26348: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11718656", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3532210176", "CPUUsageNSec": "803200000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", 
"MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882406.26547: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882406.26567: _low_level_execute_command(): starting 8238 1726882406.26570: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882405.8124075-9644-197235666699376/ > /dev/null 2>&1 && sleep 0' 8238 1726882406.27013: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882406.27017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.27019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882406.27023: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882406.27026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.27079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882406.27084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882406.27176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 8238 1726882406.29088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882406.29140: stderr chunk (state=3): >>><<< 8238 1726882406.29143: stdout chunk (state=3): >>><<< 8238 1726882406.29159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882406.29162: handler run complete 8238 1726882406.29206: attempt loop complete, returning result 8238 1726882406.29210: _execute() done 8238 1726882406.29213: dumping result to json 8238 1726882406.29229: done dumping result, returning 8238 1726882406.29239: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-54bc-d334-000000000088] 8238 1726882406.29244: sending task result for task 0affc7ec-ae25-54bc-d334-000000000088 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882406.29698: no more pending results, returning what we have 8238 1726882406.29701: results queue empty 8238 1726882406.29702: checking for any_errors_fatal 8238 1726882406.29707: done checking for any_errors_fatal 8238 1726882406.29708: checking for max_fail_percentage 8238 1726882406.29709: done checking for max_fail_percentage 8238 1726882406.29710: checking to see if all hosts have failed and the running result is not ok 8238 1726882406.29711: done checking to see if all hosts have failed 8238 1726882406.29713: getting the remaining hosts for this loop 8238 1726882406.29715: done getting the remaining hosts for this loop 8238 1726882406.29719: getting the next task for host managed_node3 8238 1726882406.29730: done getting next task for host managed_node3 8238 1726882406.29744: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8238 1726882406.29749: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882406.29766: getting variables 8238 1726882406.29772: in VariableManager get_vars() 8238 1726882406.29827: Calling all_inventory to load vars for managed_node3 8238 1726882406.29830: Calling groups_inventory to load vars for managed_node3 8238 1726882406.29839: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882406.29867: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000088 8238 1726882406.29871: WORKER PROCESS EXITING 8238 1726882406.29886: Calling all_plugins_play to load vars for managed_node3 8238 1726882406.29890: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882406.29894: Calling groups_plugins_play to load vars for managed_node3 8238 1726882406.31900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882406.33509: done with get_vars() 8238 1726882406.33528: done getting variables 8238 1726882406.33574: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:26 -0400 (0:00:00.638) 0:00:36.491 ****** 8238 1726882406.33606: entering _queue_task() for managed_node3/service 8238 1726882406.33844: worker is 1 (out of 1 available) 8238 1726882406.33859: exiting _queue_task() for managed_node3/service 8238 1726882406.33873: done queuing things up, now waiting for results queue to drain 8238 1726882406.33875: waiting for pending results... 
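[editor's note] The systemd trace above belongs to the "Enable and start NetworkManager" task whose result was just reported as ok (the body is censored because no_log was set). The service action plugin dispatched to ansible.legacy.systemd, and the module_args echoed in the trace were {'name': 'NetworkManager', 'state': 'started', 'enabled': True, ...}. A rough reconstruction of an equivalent task is sketched below; the real task in fedora.linux_system_roles.network is parameterized through role variables, so this is illustrative only, not the role's source:

    # Illustrative reconstruction from the logged module_args; not the role's actual task.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager   # from module_args 'name'
        state: started         # from module_args 'state'
        enabled: true          # from module_args 'enabled'
      no_log: true             # inferred from the censored result above
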
8238 1726882406.34116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8238 1726882406.34302: in run() - task 0affc7ec-ae25-54bc-d334-000000000089 8238 1726882406.34332: variable 'ansible_search_path' from source: unknown 8238 1726882406.34527: variable 'ansible_search_path' from source: unknown 8238 1726882406.34531: calling self._execute() 8238 1726882406.34534: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882406.34536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882406.34538: variable 'omit' from source: magic vars 8238 1726882406.34932: variable 'ansible_distribution_major_version' from source: facts 8238 1726882406.34950: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882406.35085: variable 'network_provider' from source: set_fact 8238 1726882406.35092: Evaluated conditional (network_provider == "nm"): True 8238 1726882406.35194: variable '__network_wpa_supplicant_required' from source: role '' defaults 8238 1726882406.35327: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8238 1726882406.35462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882406.37596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882406.37677: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882406.37709: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882406.37754: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882406.37773: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882406.37879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882406.37895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882406.37930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882406.37964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882406.37977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882406.38033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882406.38067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8238 1726882406.38106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882406.38147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882406.38163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882406.38205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882406.38221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882406.38257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882406.38283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882406.38298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882406.38482: variable 'network_connections' from source: task vars 8238 1726882406.38504: variable 'port2_profile' from source: play vars 8238 1726882406.38701: variable 'port2_profile' from source: play vars 8238 1726882406.38798: variable 'port1_profile' from source: play vars 8238 1726882406.38890: variable 'port1_profile' from source: play vars 8238 1726882406.38907: variable 'controller_profile' from source: play vars 8238 1726882406.38984: variable 'controller_profile' from source: play vars 8238 1726882406.39116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8238 1726882406.39359: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8238 1726882406.39392: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8238 1726882406.39509: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8238 1726882406.39547: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8238 1726882406.39597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8238 1726882406.39629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8238 1726882406.39663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882406.39721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8238 1726882406.39778: variable '__network_wireless_connections_defined' from source: role '' defaults 8238 1726882406.40095: variable 'network_connections' from source: task vars 8238 1726882406.40239: variable 'port2_profile' from source: play vars 8238 1726882406.40246: variable 'port2_profile' from source: play vars 8238 1726882406.40248: variable 'port1_profile' from source: play vars 8238 1726882406.40253: variable 'port1_profile' from source: play vars 8238 1726882406.40265: variable 'controller_profile' from source: play vars 8238 1726882406.40341: variable 'controller_profile' from source: play vars 8238 1726882406.40380: Evaluated conditional (__network_wpa_supplicant_required): False 8238 1726882406.40429: when evaluation is False, skipping this task 8238 1726882406.40432: _execute() done 8238 1726882406.40435: dumping result to json 8238 1726882406.40437: done dumping result, returning 8238 1726882406.40440: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-54bc-d334-000000000089] 8238 1726882406.40442: sending task result for task 0affc7ec-ae25-54bc-d334-000000000089 8238 1726882406.40528: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000089 8238 1726882406.40531: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 8238 1726882406.40598: no more pending results, returning what we have 8238 1726882406.40605: results queue empty 8238 1726882406.40606: checking for any_errors_fatal 8238 1726882406.40636: done checking for any_errors_fatal 8238 1726882406.40637: checking for max_fail_percentage 8238 1726882406.40639: done checking for max_fail_percentage 8238 1726882406.40642: checking to see if all hosts have failed and the running result is not ok 8238 1726882406.40643: done checking to see if all hosts have failed 8238 1726882406.40644: getting the remaining hosts for this loop 8238 1726882406.40645: done getting the remaining hosts for this loop 8238 1726882406.40651: getting the next task for host managed_node3 8238 1726882406.40663: done getting next task for host managed_node3 8238 1726882406.40668: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 8238 1726882406.40675: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 8238 1726882406.40694: getting variables 8238 1726882406.40698: in VariableManager get_vars() 8238 1726882406.40765: Calling all_inventory to load vars for managed_node3 8238 1726882406.40769: Calling groups_inventory to load vars for managed_node3 8238 1726882406.40771: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882406.40781: Calling all_plugins_play to load vars for managed_node3 8238 1726882406.40784: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882406.40787: Calling groups_plugins_play to load vars for managed_node3 8238 1726882406.42283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882406.43706: done with get_vars() 8238 1726882406.43726: done getting variables 8238 1726882406.43793: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:26 -0400 (0:00:00.102) 0:00:36.593 ****** 8238 1726882406.43840: entering _queue_task() for managed_node3/service 8238 1726882406.44163: worker is 1 (out of 1 available) 8238 1726882406.44177: exiting _queue_task() for managed_node3/service 8238 1726882406.44188: done queuing things up, now waiting for results queue to drain 8238 1726882406.44189: waiting for pending results... 8238 1726882406.44423: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 8238 1726882406.44518: in run() - task 0affc7ec-ae25-54bc-d334-00000000008a 8238 1726882406.44560: variable 'ansible_search_path' from source: unknown 8238 1726882406.44564: variable 'ansible_search_path' from source: unknown 8238 1726882406.44604: calling self._execute() 8238 1726882406.44683: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882406.44717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882406.44720: variable 'omit' from source: magic vars 8238 1726882406.45028: variable 'ansible_distribution_major_version' from source: facts 8238 1726882406.45040: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882406.45166: variable 'network_provider' from source: set_fact 8238 1726882406.45170: Evaluated conditional (network_provider == "initscripts"): False 8238 1726882406.45172: when evaluation is False, skipping this task 8238 1726882406.45175: _execute() done 8238 1726882406.45177: dumping result to json 8238 1726882406.45182: done dumping result, returning 8238 1726882406.45189: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-54bc-d334-00000000008a] 8238 1726882406.45195: sending task result for task 0affc7ec-ae25-54bc-d334-00000000008a 8238 1726882406.45329: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000008a 8238 1726882406.45332: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8238 1726882406.45390: 
no more pending results, returning what we have 8238 1726882406.45394: results queue empty 8238 1726882406.45395: checking for any_errors_fatal 8238 1726882406.45402: done checking for any_errors_fatal 8238 1726882406.45403: checking for max_fail_percentage 8238 1726882406.45405: done checking for max_fail_percentage 8238 1726882406.45405: checking to see if all hosts have failed and the running result is not ok 8238 1726882406.45406: done checking to see if all hosts have failed 8238 1726882406.45407: getting the remaining hosts for this loop 8238 1726882406.45408: done getting the remaining hosts for this loop 8238 1726882406.45412: getting the next task for host managed_node3 8238 1726882406.45418: done getting next task for host managed_node3 8238 1726882406.45424: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8238 1726882406.45428: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882406.45445: getting variables 8238 1726882406.45447: in VariableManager get_vars() 8238 1726882406.45480: Calling all_inventory to load vars for managed_node3 8238 1726882406.45483: Calling groups_inventory to load vars for managed_node3 8238 1726882406.45485: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882406.45493: Calling all_plugins_play to load vars for managed_node3 8238 1726882406.45496: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882406.45499: Calling groups_plugins_play to load vars for managed_node3 8238 1726882406.46857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882406.48117: done with get_vars() 8238 1726882406.48135: done getting variables 8238 1726882406.48185: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:26 -0400 (0:00:00.043) 0:00:36.637 ****** 8238 1726882406.48215: entering _queue_task() for managed_node3/copy 8238 1726882406.48421: worker is 1 (out of 1 available) 8238 1726882406.48437: exiting _queue_task() for managed_node3/copy 8238 1726882406.48449: done queuing things up, now waiting for results queue to drain 8238 1726882406.48451: waiting for pending results... 
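[editor's note] The "Enable network service" task above was skipped because the trace evaluated (network_provider == "initscripts") to False (network_provider had been set to "nm" earlier), and the "Ensure initscripts network file dependency is present" task just queued is guarded by the same condition, as its false_condition in the next result shows. A minimal sketch of that guard pattern follows; the service name 'network' is an assumption for illustration and the task body is not the role's source:

    # Sketch of an initscripts-only task guarded by the provider check seen in the trace.
    - name: Enable network service
      ansible.builtin.service:
        name: network                              # assumed name, illustration only
        enabled: true
      when: network_provider == "initscripts"      # False on this host, so the task is skipped
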
8238 1726882406.48634: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8238 1726882406.48725: in run() - task 0affc7ec-ae25-54bc-d334-00000000008b 8238 1726882406.48738: variable 'ansible_search_path' from source: unknown 8238 1726882406.48741: variable 'ansible_search_path' from source: unknown 8238 1726882406.48827: calling self._execute() 8238 1726882406.48878: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882406.48882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882406.48892: variable 'omit' from source: magic vars 8238 1726882406.49220: variable 'ansible_distribution_major_version' from source: facts 8238 1726882406.49232: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882406.49345: variable 'network_provider' from source: set_fact 8238 1726882406.49349: Evaluated conditional (network_provider == "initscripts"): False 8238 1726882406.49352: when evaluation is False, skipping this task 8238 1726882406.49356: _execute() done 8238 1726882406.49359: dumping result to json 8238 1726882406.49368: done dumping result, returning 8238 1726882406.49372: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-54bc-d334-00000000008b] 8238 1726882406.49378: sending task result for task 0affc7ec-ae25-54bc-d334-00000000008b 8238 1726882406.49490: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000008b 8238 1726882406.49493: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8238 1726882406.49545: no more pending results, returning what we have 8238 1726882406.49549: results queue empty 8238 1726882406.49549: checking for any_errors_fatal 8238 1726882406.49553: done checking for any_errors_fatal 8238 1726882406.49554: checking for max_fail_percentage 8238 1726882406.49555: done checking for max_fail_percentage 8238 1726882406.49556: checking to see if all hosts have failed and the running result is not ok 8238 1726882406.49557: done checking to see if all hosts have failed 8238 1726882406.49558: getting the remaining hosts for this loop 8238 1726882406.49559: done getting the remaining hosts for this loop 8238 1726882406.49562: getting the next task for host managed_node3 8238 1726882406.49569: done getting next task for host managed_node3 8238 1726882406.49572: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8238 1726882406.49576: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 8238 1726882406.49594: getting variables 8238 1726882406.49595: in VariableManager get_vars() 8238 1726882406.49638: Calling all_inventory to load vars for managed_node3 8238 1726882406.49641: Calling groups_inventory to load vars for managed_node3 8238 1726882406.49647: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882406.49657: Calling all_plugins_play to load vars for managed_node3 8238 1726882406.49664: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882406.49670: Calling groups_plugins_play to load vars for managed_node3 8238 1726882406.50781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882406.52362: done with get_vars() 8238 1726882406.52387: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:26 -0400 (0:00:00.042) 0:00:36.679 ****** 8238 1726882406.52487: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8238 1726882406.52789: worker is 1 (out of 1 available) 8238 1726882406.52804: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8238 1726882406.52819: done queuing things up, now waiting for results queue to drain 8238 1726882406.52821: waiting for pending results... 8238 1726882406.53441: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8238 1726882406.53486: in run() - task 0affc7ec-ae25-54bc-d334-00000000008c 8238 1726882406.53508: variable 'ansible_search_path' from source: unknown 8238 1726882406.53517: variable 'ansible_search_path' from source: unknown 8238 1726882406.53572: calling self._execute() 8238 1726882406.53684: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882406.53698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882406.53719: variable 'omit' from source: magic vars 8238 1726882406.54167: variable 'ansible_distribution_major_version' from source: facts 8238 1726882406.54186: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882406.54201: variable 'omit' from source: magic vars 8238 1726882406.54288: variable 'omit' from source: magic vars 8238 1726882406.54482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8238 1726882406.57168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8238 1726882406.57246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8238 1726882406.57298: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8238 1726882406.57344: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8238 1726882406.57383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8238 1726882406.57471: variable 'network_provider' from source: set_fact 8238 1726882406.57800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8238 1726882406.57803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8238 1726882406.57813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8238 1726882406.57817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8238 1726882406.57819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8238 1726882406.57838: variable 'omit' from source: magic vars 8238 1726882406.57928: variable 'omit' from source: magic vars 8238 1726882406.58130: variable 'network_connections' from source: task vars 8238 1726882406.58135: variable 'port2_profile' from source: play vars 8238 1726882406.58138: variable 'port2_profile' from source: play vars 8238 1726882406.58140: variable 'port1_profile' from source: play vars 8238 1726882406.58206: variable 'port1_profile' from source: play vars 8238 1726882406.58221: variable 'controller_profile' from source: play vars 8238 1726882406.58366: variable 'controller_profile' from source: play vars 8238 1726882406.58507: variable 'omit' from source: magic vars 8238 1726882406.58520: variable '__lsr_ansible_managed' from source: task vars 8238 1726882406.58602: variable '__lsr_ansible_managed' from source: task vars 8238 1726882406.58816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8238 1726882406.59080: Loaded config def from plugin (lookup/template) 8238 1726882406.59091: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 8238 1726882406.59128: File lookup term: get_ansible_managed.j2 8238 1726882406.59142: variable 'ansible_search_path' from source: unknown 8238 1726882406.59153: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 8238 1726882406.59178: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 8238 1726882406.59192: variable 'ansible_search_path' from source: unknown 8238 1726882406.64687: variable 'ansible_managed' from source: unknown 8238 1726882406.64791: variable 'omit' from source: magic vars 8238 1726882406.64815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882406.64839: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882406.64852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882406.64868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882406.64876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882406.64899: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882406.64902: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882406.64905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882406.64979: Set connection var ansible_connection to ssh 8238 1726882406.64982: Set connection var ansible_shell_type to sh 8238 1726882406.64986: Set connection var ansible_pipelining to False 8238 1726882406.64992: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882406.64998: Set connection var ansible_timeout to 10 8238 1726882406.65005: Set connection var ansible_shell_executable to /bin/sh 8238 1726882406.65025: variable 'ansible_shell_executable' from source: unknown 8238 1726882406.65029: variable 'ansible_connection' from source: unknown 8238 1726882406.65031: variable 'ansible_module_compression' from source: unknown 8238 1726882406.65036: variable 'ansible_shell_type' from source: unknown 8238 1726882406.65038: variable 'ansible_shell_executable' from source: unknown 8238 1726882406.65040: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882406.65043: variable 'ansible_pipelining' from source: unknown 8238 1726882406.65046: variable 'ansible_timeout' from source: unknown 8238 1726882406.65056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882406.65149: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882406.65157: variable 'omit' from source: magic vars 8238 1726882406.65173: starting attempt loop 8238 1726882406.65177: running the handler 8238 1726882406.65181: _low_level_execute_command(): starting 8238 1726882406.65188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882406.65813: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882406.65906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882406.65910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882406.65943: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882406.65985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882406.66076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882406.67834: stdout chunk (state=3): >>>/root <<< 8238 1726882406.67996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882406.67998: stdout chunk (state=3): >>><<< 8238 1726882406.68000: stderr chunk (state=3): >>><<< 8238 1726882406.68029: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882406.68033: _low_level_execute_command(): starting 8238 1726882406.68038: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486 `" && echo ansible-tmp-1726882406.6801972-9680-144571727502486="` echo /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486 `" ) && sleep 0' 8238 1726882406.68662: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882406.68666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882406.68753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882406.70712: stdout chunk (state=3): >>>ansible-tmp-1726882406.6801972-9680-144571727502486=/root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486 <<< 8238 1726882406.70829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882406.70874: stderr chunk (state=3): >>><<< 8238 1726882406.70877: stdout chunk (state=3): >>><<< 8238 1726882406.70894: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882406.6801972-9680-144571727502486=/root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882406.70934: variable 'ansible_module_compression' from source: unknown 8238 1726882406.70991: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 8238 1726882406.71045: variable 'ansible_facts' from source: unknown 8238 1726882406.71228: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/AnsiballZ_network_connections.py 8238 1726882406.71435: Sending initial data 8238 1726882406.71438: Sent initial data (166 bytes) 8238 1726882406.71888: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882406.71897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882406.71906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882406.71939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.72028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.72045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882406.72092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882406.72185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882406.73758: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 8238 1726882406.73769: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882406.73844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882406.73930: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpvwp72p9m /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/AnsiballZ_network_connections.py <<< 8238 1726882406.73933: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/AnsiballZ_network_connections.py" <<< 8238 1726882406.74018: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpvwp72p9m" to remote "/root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/AnsiballZ_network_connections.py" <<< 8238 1726882406.74021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/AnsiballZ_network_connections.py" <<< 8238 1726882406.75188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882406.75191: stdout chunk (state=3): >>><<< 8238 1726882406.75194: stderr chunk (state=3): >>><<< 8238 1726882406.75196: done transferring module to remote 8238 1726882406.75198: _low_level_execute_command(): starting 8238 1726882406.75200: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/ /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/AnsiballZ_network_connections.py && sleep 0' 8238 1726882406.75648: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882406.75651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.75657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882406.75659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882406.75661: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.75709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882406.75712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882406.75801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882406.77592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882406.77636: stderr chunk (state=3): >>><<< 8238 1726882406.77640: stdout chunk (state=3): >>><<< 8238 1726882406.77652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882406.77656: _low_level_execute_command(): starting 8238 1726882406.77663: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/AnsiballZ_network_connections.py && sleep 0' 8238 1726882406.78072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882406.78075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.78077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882406.78080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882406.78128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882406.78131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882406.78220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882407.42338: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 8238 1726882407.42414: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/bcdb074f-88c7-45b9-82b3-bdb89677858d: error=unknown <<< 8238 1726882407.45148: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 8238 1726882407.45175: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail<<< 8238 1726882407.45190: stdout chunk (state=3): >>> ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/8b1930de-0635-4914-bc1a-96ab6cfe44b6: error=unknown<<< 8238 1726882407.45206: stdout chunk (state=3): >>> <<< 8238 1726882407.47007: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 8238 1726882407.47011: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/8c66fb3e-08a1-411c-8a8a-97eb75d3c57e: error=unknown <<< 8238 1726882407.47252: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 8238 1726882407.49265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882407.49325: stderr chunk (state=3): >>><<< 8238 1726882407.49328: stdout chunk (state=3): >>><<< 8238 1726882407.49348: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/bcdb074f-88c7-45b9-82b3-bdb89677858d: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/8b1930de-0635-4914-bc1a-96ab6cfe44b6: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ifq7dkoo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/8c66fb3e-08a1-411c-8a8a-97eb75d3c57e: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882407.49390: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882407.49398: _low_level_execute_command(): starting 8238 1726882407.49403: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882406.6801972-9680-144571727502486/ > /dev/null 2>&1 && sleep 0' 8238 1726882407.49889: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882407.49893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.49895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882407.49899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882407.49901: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.49952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882407.49960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882407.49962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882407.50044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882407.51994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882407.52044: stderr chunk (state=3): >>><<< 8238 1726882407.52049: stdout chunk (state=3): >>><<< 8238 1726882407.52065: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882407.52071: handler run complete 8238 1726882407.52094: attempt loop complete, returning result 8238 1726882407.52097: _execute() done 8238 1726882407.52100: dumping result to json 8238 1726882407.52107: done dumping result, returning 8238 1726882407.52115: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-54bc-d334-00000000008c] 8238 1726882407.52120: sending task result for task 0affc7ec-ae25-54bc-d334-00000000008c 8238 1726882407.52235: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000008c 8238 1726882407.52238: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 8238 1726882407.52361: no more pending results, returning what we have 8238 1726882407.52364: results queue empty 8238 1726882407.52365: checking for any_errors_fatal 8238 1726882407.52372: done checking for any_errors_fatal 8238 1726882407.52372: checking for max_fail_percentage 8238 1726882407.52374: done checking for max_fail_percentage 8238 1726882407.52375: checking to see if all hosts have failed and the running result is not 
ok 8238 1726882407.52376: done checking to see if all hosts have failed 8238 1726882407.52376: getting the remaining hosts for this loop 8238 1726882407.52378: done getting the remaining hosts for this loop 8238 1726882407.52381: getting the next task for host managed_node3 8238 1726882407.52388: done getting next task for host managed_node3 8238 1726882407.52391: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 8238 1726882407.52396: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882407.52406: getting variables 8238 1726882407.52407: in VariableManager get_vars() 8238 1726882407.52453: Calling all_inventory to load vars for managed_node3 8238 1726882407.52459: Calling groups_inventory to load vars for managed_node3 8238 1726882407.52461: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882407.52475: Calling all_plugins_play to load vars for managed_node3 8238 1726882407.52477: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882407.52480: Calling groups_plugins_play to load vars for managed_node3 8238 1726882407.53488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882407.54661: done with get_vars() 8238 1726882407.54681: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:27 -0400 (0:00:01.022) 0:00:37.702 ****** 8238 1726882407.54759: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8238 1726882407.55035: worker is 1 (out of 1 available) 8238 1726882407.55049: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8238 1726882407.55066: done queuing things up, now waiting for results queue to drain 8238 1726882407.55068: waiting for pending results... 
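For context on the changed: result above, the module_args that the role passed to fedora.linux_system_roles.network_connections can be reproduced from play variables alone. The following is a minimal, hypothetical playbook sketch, not the actual test playbook: it assumes controller_profile resolves to bond0, port1_profile to bond0.0, and port2_profile to bond0.1, which is inferred only from the order of the logged connections list.

    - hosts: managed_node3
      gather_facts: true
      vars:
        controller_profile: bond0
        port1_profile: bond0.0
        port2_profile: bond0.1
        # Tear-down request: bring each profile down and remove it, in the
        # same order as the connections list recorded in the module_args above.
        network_connections:
          - name: "{{ port2_profile }}"
            persistent_state: absent
            state: down
          - name: "{{ port1_profile }}"
            persistent_state: absent
            state: down
          - name: "{{ controller_profile }}"
            persistent_state: absent
            state: down
      roles:
        - fedora.linux_system_roles.network

The provider: nm and __header arguments visible in the logged module_args are supplied by the role itself (network_provider from set_fact and the get_ansible_managed.j2 template lookup seen earlier), so they do not need to appear in the play.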
8238 1726882407.55261: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 8238 1726882407.55361: in run() - task 0affc7ec-ae25-54bc-d334-00000000008d 8238 1726882407.55373: variable 'ansible_search_path' from source: unknown 8238 1726882407.55376: variable 'ansible_search_path' from source: unknown 8238 1726882407.55407: calling self._execute() 8238 1726882407.55484: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.55489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.55499: variable 'omit' from source: magic vars 8238 1726882407.55798: variable 'ansible_distribution_major_version' from source: facts 8238 1726882407.55808: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882407.55898: variable 'network_state' from source: role '' defaults 8238 1726882407.55906: Evaluated conditional (network_state != {}): False 8238 1726882407.55909: when evaluation is False, skipping this task 8238 1726882407.55912: _execute() done 8238 1726882407.55915: dumping result to json 8238 1726882407.55920: done dumping result, returning 8238 1726882407.55929: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-54bc-d334-00000000008d] 8238 1726882407.55935: sending task result for task 0affc7ec-ae25-54bc-d334-00000000008d 8238 1726882407.56029: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000008d 8238 1726882407.56031: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8238 1726882407.56106: no more pending results, returning what we have 8238 1726882407.56110: results queue empty 8238 1726882407.56112: checking for any_errors_fatal 8238 1726882407.56120: done checking for any_errors_fatal 8238 1726882407.56120: checking for max_fail_percentage 8238 1726882407.56123: done checking for max_fail_percentage 8238 1726882407.56124: checking to see if all hosts have failed and the running result is not ok 8238 1726882407.56125: done checking to see if all hosts have failed 8238 1726882407.56126: getting the remaining hosts for this loop 8238 1726882407.56128: done getting the remaining hosts for this loop 8238 1726882407.56132: getting the next task for host managed_node3 8238 1726882407.56138: done getting next task for host managed_node3 8238 1726882407.56144: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8238 1726882407.56147: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882407.56166: getting variables 8238 1726882407.56168: in VariableManager get_vars() 8238 1726882407.56201: Calling all_inventory to load vars for managed_node3 8238 1726882407.56204: Calling groups_inventory to load vars for managed_node3 8238 1726882407.56206: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882407.56214: Calling all_plugins_play to load vars for managed_node3 8238 1726882407.56217: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882407.56219: Calling groups_plugins_play to load vars for managed_node3 8238 1726882407.57306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882407.58980: done with get_vars() 8238 1726882407.59009: done getting variables 8238 1726882407.59085: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:27 -0400 (0:00:00.043) 0:00:37.746 ****** 8238 1726882407.59124: entering _queue_task() for managed_node3/debug 8238 1726882407.59431: worker is 1 (out of 1 available) 8238 1726882407.59446: exiting _queue_task() for managed_node3/debug 8238 1726882407.59463: done queuing things up, now waiting for results queue to drain 8238 1726882407.59465: waiting for pending results... 8238 1726882407.59845: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8238 1726882407.59963: in run() - task 0affc7ec-ae25-54bc-d334-00000000008e 8238 1726882407.59987: variable 'ansible_search_path' from source: unknown 8238 1726882407.59995: variable 'ansible_search_path' from source: unknown 8238 1726882407.60051: calling self._execute() 8238 1726882407.60168: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.60183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.60201: variable 'omit' from source: magic vars 8238 1726882407.60934: variable 'ansible_distribution_major_version' from source: facts 8238 1726882407.60978: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882407.61060: variable 'omit' from source: magic vars 8238 1726882407.61083: variable 'omit' from source: magic vars 8238 1726882407.61133: variable 'omit' from source: magic vars 8238 1726882407.61191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882407.61240: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882407.61275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882407.61303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882407.61323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882407.61365: variable 'inventory_hostname' from source: host vars 
for 'managed_node3' 8238 1726882407.61383: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.61386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.61528: Set connection var ansible_connection to ssh 8238 1726882407.61531: Set connection var ansible_shell_type to sh 8238 1726882407.61534: Set connection var ansible_pipelining to False 8238 1726882407.61536: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882407.61546: Set connection var ansible_timeout to 10 8238 1726882407.61601: Set connection var ansible_shell_executable to /bin/sh 8238 1726882407.61604: variable 'ansible_shell_executable' from source: unknown 8238 1726882407.61607: variable 'ansible_connection' from source: unknown 8238 1726882407.61611: variable 'ansible_module_compression' from source: unknown 8238 1726882407.61613: variable 'ansible_shell_type' from source: unknown 8238 1726882407.61615: variable 'ansible_shell_executable' from source: unknown 8238 1726882407.61621: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.61633: variable 'ansible_pipelining' from source: unknown 8238 1726882407.61640: variable 'ansible_timeout' from source: unknown 8238 1726882407.61647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.61808: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882407.61928: variable 'omit' from source: magic vars 8238 1726882407.61932: starting attempt loop 8238 1726882407.61934: running the handler 8238 1726882407.61996: variable '__network_connections_result' from source: set_fact 8238 1726882407.62089: handler run complete 8238 1726882407.62114: attempt loop complete, returning result 8238 1726882407.62124: _execute() done 8238 1726882407.62133: dumping result to json 8238 1726882407.62146: done dumping result, returning 8238 1726882407.62164: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-54bc-d334-00000000008e] 8238 1726882407.62176: sending task result for task 0affc7ec-ae25-54bc-d334-00000000008e 8238 1726882407.62529: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000008e 8238 1726882407.62533: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 8238 1726882407.62605: no more pending results, returning what we have 8238 1726882407.62609: results queue empty 8238 1726882407.62610: checking for any_errors_fatal 8238 1726882407.62616: done checking for any_errors_fatal 8238 1726882407.62617: checking for max_fail_percentage 8238 1726882407.62619: done checking for max_fail_percentage 8238 1726882407.62620: checking to see if all hosts have failed and the running result is not ok 8238 1726882407.62621: done checking to see if all hosts have failed 8238 1726882407.62623: getting the remaining hosts for this loop 8238 1726882407.62625: done getting the remaining hosts for this loop 8238 1726882407.62629: getting the next task for host managed_node3 8238 1726882407.62636: done getting next task for host managed_node3 8238 1726882407.62641: ^ task is: TASK: fedora.linux_system_roles.network : Show debug 
messages for the network_connections 8238 1726882407.62645: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882407.62660: getting variables 8238 1726882407.62662: in VariableManager get_vars() 8238 1726882407.62702: Calling all_inventory to load vars for managed_node3 8238 1726882407.62706: Calling groups_inventory to load vars for managed_node3 8238 1726882407.62708: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882407.62719: Calling all_plugins_play to load vars for managed_node3 8238 1726882407.62739: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882407.62745: Calling groups_plugins_play to load vars for managed_node3 8238 1726882407.64194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882407.66747: done with get_vars() 8238 1726882407.66768: done getting variables 8238 1726882407.66839: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:27 -0400 (0:00:00.077) 0:00:37.823 ****** 8238 1726882407.66868: entering _queue_task() for managed_node3/debug 8238 1726882407.67148: worker is 1 (out of 1 available) 8238 1726882407.67167: exiting _queue_task() for managed_node3/debug 8238 1726882407.67180: done queuing things up, now waiting for results queue to drain 8238 1726882407.67181: waiting for pending results... 
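The two "Show ... messages for the network_connections" steps around this point are plain debug tasks over the __network_connections_result fact captured from the module call above (the log shows it coming from set_fact); the first prints only its stderr_lines (the ok: result above), the second prints the full result (the ok: result that follows). An illustrative equivalent, not the role source:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result

In the actual role these tasks sit behind the same ansible_distribution_major_version != '6' check that the log shows being evaluated before each step.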
8238 1726882407.67375: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8238 1726882407.67480: in run() - task 0affc7ec-ae25-54bc-d334-00000000008f 8238 1726882407.67494: variable 'ansible_search_path' from source: unknown 8238 1726882407.67498: variable 'ansible_search_path' from source: unknown 8238 1726882407.67532: calling self._execute() 8238 1726882407.67605: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.67612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.67623: variable 'omit' from source: magic vars 8238 1726882407.67915: variable 'ansible_distribution_major_version' from source: facts 8238 1726882407.67927: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882407.67933: variable 'omit' from source: magic vars 8238 1726882407.67986: variable 'omit' from source: magic vars 8238 1726882407.68015: variable 'omit' from source: magic vars 8238 1726882407.68051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882407.68085: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882407.68101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882407.68115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882407.68125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882407.68153: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882407.68159: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.68161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.68236: Set connection var ansible_connection to ssh 8238 1726882407.68239: Set connection var ansible_shell_type to sh 8238 1726882407.68242: Set connection var ansible_pipelining to False 8238 1726882407.68249: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882407.68257: Set connection var ansible_timeout to 10 8238 1726882407.68262: Set connection var ansible_shell_executable to /bin/sh 8238 1726882407.68285: variable 'ansible_shell_executable' from source: unknown 8238 1726882407.68289: variable 'ansible_connection' from source: unknown 8238 1726882407.68293: variable 'ansible_module_compression' from source: unknown 8238 1726882407.68295: variable 'ansible_shell_type' from source: unknown 8238 1726882407.68298: variable 'ansible_shell_executable' from source: unknown 8238 1726882407.68300: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.68302: variable 'ansible_pipelining' from source: unknown 8238 1726882407.68304: variable 'ansible_timeout' from source: unknown 8238 1726882407.68306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.68420: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882407.68458: variable 'omit' from source: 
magic vars 8238 1726882407.68462: starting attempt loop 8238 1726882407.68464: running the handler 8238 1726882407.68491: variable '__network_connections_result' from source: set_fact 8238 1726882407.68661: variable '__network_connections_result' from source: set_fact 8238 1726882407.68757: handler run complete 8238 1726882407.68783: attempt loop complete, returning result 8238 1726882407.68786: _execute() done 8238 1726882407.68789: dumping result to json 8238 1726882407.68821: done dumping result, returning 8238 1726882407.68828: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-54bc-d334-00000000008f] 8238 1726882407.68830: sending task result for task 0affc7ec-ae25-54bc-d334-00000000008f ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 8238 1726882407.69047: no more pending results, returning what we have 8238 1726882407.69052: results queue empty 8238 1726882407.69054: checking for any_errors_fatal 8238 1726882407.69063: done checking for any_errors_fatal 8238 1726882407.69064: checking for max_fail_percentage 8238 1726882407.69066: done checking for max_fail_percentage 8238 1726882407.69067: checking to see if all hosts have failed and the running result is not ok 8238 1726882407.69068: done checking to see if all hosts have failed 8238 1726882407.69069: getting the remaining hosts for this loop 8238 1726882407.69071: done getting the remaining hosts for this loop 8238 1726882407.69075: getting the next task for host managed_node3 8238 1726882407.69085: done getting next task for host managed_node3 8238 1726882407.69090: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8238 1726882407.69094: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882407.69105: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000008f 8238 1726882407.69107: WORKER PROCESS EXITING 8238 1726882407.69116: getting variables 8238 1726882407.69117: in VariableManager get_vars() 8238 1726882407.69168: Calling all_inventory to load vars for managed_node3 8238 1726882407.69171: Calling groups_inventory to load vars for managed_node3 8238 1726882407.69173: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882407.69183: Calling all_plugins_play to load vars for managed_node3 8238 1726882407.69192: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882407.69195: Calling groups_plugins_play to load vars for managed_node3 8238 1726882407.70540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882407.71691: done with get_vars() 8238 1726882407.71710: done getting variables 8238 1726882407.71763: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:27 -0400 (0:00:00.049) 0:00:37.872 ****** 8238 1726882407.71790: entering _queue_task() for managed_node3/debug 8238 1726882407.72137: worker is 1 (out of 1 available) 8238 1726882407.72149: exiting _queue_task() for managed_node3/debug 8238 1726882407.72163: done queuing things up, now waiting for results queue to drain 8238 1726882407.72166: waiting for pending results... 
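The __network_connections_result dump above records a teardown request: three profiles (bond0.1, bond0.0, bond0) driven to state: down and persistent_state: absent through the NetworkManager provider (provider: nm), with force_state_change and ignore_errors both false. A sketch of how a calling playbook might express that request via the role's network_connections variable; only the connection entries are taken from the log, while the surrounding play and role invocation are assumed:

# Assumed play wrapper; the connection list mirrors the module_args above.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0.1
            persistent_state: absent
            state: down
          - name: bond0.0
            persistent_state: absent
            state: down
          - name: bond0
            persistent_state: absent
            state: down

The following task, "Show debug messages for the network_state", is then skipped because network_state comes from the role defaults as an empty dict, so the conditional network_state != {} evaluates to False, exactly as the skipping: line in the trace below reports.
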
8238 1726882407.72531: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8238 1726882407.72719: in run() - task 0affc7ec-ae25-54bc-d334-000000000090 8238 1726882407.72746: variable 'ansible_search_path' from source: unknown 8238 1726882407.72825: variable 'ansible_search_path' from source: unknown 8238 1726882407.72831: calling self._execute() 8238 1726882407.72947: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.72951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.72954: variable 'omit' from source: magic vars 8238 1726882407.73413: variable 'ansible_distribution_major_version' from source: facts 8238 1726882407.73428: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882407.73516: variable 'network_state' from source: role '' defaults 8238 1726882407.73531: Evaluated conditional (network_state != {}): False 8238 1726882407.73535: when evaluation is False, skipping this task 8238 1726882407.73538: _execute() done 8238 1726882407.73541: dumping result to json 8238 1726882407.73544: done dumping result, returning 8238 1726882407.73547: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-54bc-d334-000000000090] 8238 1726882407.73554: sending task result for task 0affc7ec-ae25-54bc-d334-000000000090 8238 1726882407.73650: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000090 8238 1726882407.73653: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 8238 1726882407.73700: no more pending results, returning what we have 8238 1726882407.73705: results queue empty 8238 1726882407.73706: checking for any_errors_fatal 8238 1726882407.73717: done checking for any_errors_fatal 8238 1726882407.73718: checking for max_fail_percentage 8238 1726882407.73720: done checking for max_fail_percentage 8238 1726882407.73721: checking to see if all hosts have failed and the running result is not ok 8238 1726882407.73730: done checking to see if all hosts have failed 8238 1726882407.73731: getting the remaining hosts for this loop 8238 1726882407.73733: done getting the remaining hosts for this loop 8238 1726882407.73737: getting the next task for host managed_node3 8238 1726882407.73744: done getting next task for host managed_node3 8238 1726882407.73749: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 8238 1726882407.73753: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882407.73772: getting variables 8238 1726882407.73774: in VariableManager get_vars() 8238 1726882407.73810: Calling all_inventory to load vars for managed_node3 8238 1726882407.73813: Calling groups_inventory to load vars for managed_node3 8238 1726882407.73815: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882407.73826: Calling all_plugins_play to load vars for managed_node3 8238 1726882407.73829: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882407.73832: Calling groups_plugins_play to load vars for managed_node3 8238 1726882407.74972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882407.76951: done with get_vars() 8238 1726882407.76980: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:27 -0400 (0:00:00.052) 0:00:37.925 ****** 8238 1726882407.77066: entering _queue_task() for managed_node3/ping 8238 1726882407.77341: worker is 1 (out of 1 available) 8238 1726882407.77357: exiting _queue_task() for managed_node3/ping 8238 1726882407.77371: done queuing things up, now waiting for results queue to drain 8238 1726882407.77372: waiting for pending results... 8238 1726882407.77582: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 8238 1726882407.77688: in run() - task 0affc7ec-ae25-54bc-d334-000000000091 8238 1726882407.77703: variable 'ansible_search_path' from source: unknown 8238 1726882407.77707: variable 'ansible_search_path' from source: unknown 8238 1726882407.77739: calling self._execute() 8238 1726882407.77818: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.77821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.77834: variable 'omit' from source: magic vars 8238 1726882407.78130: variable 'ansible_distribution_major_version' from source: facts 8238 1726882407.78141: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882407.78147: variable 'omit' from source: magic vars 8238 1726882407.78200: variable 'omit' from source: magic vars 8238 1726882407.78269: variable 'omit' from source: magic vars 8238 1726882407.78273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882407.78295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882407.78310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882407.78327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882407.78336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882407.78363: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882407.78366: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.78371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.78450: Set connection var ansible_connection to ssh 8238 1726882407.78453: Set connection var ansible_shell_type to sh 8238 1726882407.78459: 
Set connection var ansible_pipelining to False 8238 1726882407.78461: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882407.78468: Set connection var ansible_timeout to 10 8238 1726882407.78475: Set connection var ansible_shell_executable to /bin/sh 8238 1726882407.78495: variable 'ansible_shell_executable' from source: unknown 8238 1726882407.78498: variable 'ansible_connection' from source: unknown 8238 1726882407.78501: variable 'ansible_module_compression' from source: unknown 8238 1726882407.78505: variable 'ansible_shell_type' from source: unknown 8238 1726882407.78507: variable 'ansible_shell_executable' from source: unknown 8238 1726882407.78509: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882407.78512: variable 'ansible_pipelining' from source: unknown 8238 1726882407.78514: variable 'ansible_timeout' from source: unknown 8238 1726882407.78598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882407.78874: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8238 1726882407.78878: variable 'omit' from source: magic vars 8238 1726882407.78881: starting attempt loop 8238 1726882407.78883: running the handler 8238 1726882407.78885: _low_level_execute_command(): starting 8238 1726882407.78888: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882407.79551: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.79604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882407.79647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882407.79712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882407.79820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882407.81574: stdout chunk (state=3): >>>/root <<< 8238 1726882407.81679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882407.81738: stderr chunk (state=3): >>><<< 8238 1726882407.81742: stdout chunk (state=3): >>><<< 8238 1726882407.81765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882407.81778: _low_level_execute_command(): starting 8238 1726882407.81785: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205 `" && echo ansible-tmp-1726882407.8176665-9731-63027742398205="` echo /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205 `" ) && sleep 0' 8238 1726882407.82258: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882407.82264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882407.82267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8238 1726882407.82269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882407.82281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.82318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882407.82328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882407.82414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882407.84385: stdout chunk (state=3): >>>ansible-tmp-1726882407.8176665-9731-63027742398205=/root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205 <<< 8238 1726882407.84504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882407.84560: stderr chunk (state=3): >>><<< 8238 1726882407.84563: stdout chunk (state=3): >>><<< 8238 1726882407.84576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882407.8176665-9731-63027742398205=/root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205 
, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882407.84616: variable 'ansible_module_compression' from source: unknown 8238 1726882407.84657: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 8238 1726882407.84691: variable 'ansible_facts' from source: unknown 8238 1726882407.84746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/AnsiballZ_ping.py 8238 1726882407.84852: Sending initial data 8238 1726882407.84855: Sent initial data (150 bytes) 8238 1726882407.85303: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882407.85346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882407.85349: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.85352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882407.85357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882407.85360: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.85396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882407.85399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882407.85488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882407.87112: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 8238 1726882407.87116: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882407.87192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882407.87283: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp5lz60k92 /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/AnsiballZ_ping.py <<< 8238 1726882407.87285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/AnsiballZ_ping.py" <<< 8238 1726882407.87359: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp5lz60k92" to remote "/root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/AnsiballZ_ping.py" <<< 8238 1726882407.88046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882407.88106: stderr chunk (state=3): >>><<< 8238 1726882407.88109: stdout chunk (state=3): >>><<< 8238 1726882407.88129: done transferring module to remote 8238 1726882407.88138: _low_level_execute_command(): starting 8238 1726882407.88143: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/ /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/AnsiballZ_ping.py && sleep 0' 8238 1726882407.88579: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882407.88582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.88585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882407.88587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.88641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882407.88652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882407.88734: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882407.90545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882407.90587: stderr chunk (state=3): >>><<< 8238 1726882407.90591: stdout chunk (state=3): >>><<< 8238 1726882407.90602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882407.90606: _low_level_execute_command(): starting 8238 1726882407.90609: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/AnsiballZ_ping.py && sleep 0' 8238 1726882407.91030: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882407.91034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.91036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882407.91038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882407.91040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882407.91087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882407.91091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882407.91185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.07612: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 8238 1726882408.08630: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882408.08693: stderr chunk (state=3): >>><<< 8238 1726882408.08696: stdout chunk (state=3): >>><<< 8238 1726882408.08710: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882408.08736: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882408.08748: _low_level_execute_command(): starting 8238 1726882408.08752: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882407.8176665-9731-63027742398205/ > /dev/null 2>&1 && sleep 0' 8238 1726882408.09187: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.09192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882408.09194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.09246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.09253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.09338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.11325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.11328: stdout chunk (state=3): >>><<< 8238 1726882408.11331: stderr chunk (state=3): >>><<< 8238 1726882408.11529: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.11536: handler run complete 8238 1726882408.11539: attempt loop complete, returning result 8238 1726882408.11542: _execute() done 8238 1726882408.11544: dumping result to json 8238 1726882408.11546: done dumping result, returning 8238 1726882408.11549: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-54bc-d334-000000000091] 8238 1726882408.11551: sending task result for task 0affc7ec-ae25-54bc-d334-000000000091 ok: [managed_node3] => { "changed": false, "ping": "pong" } 8238 1726882408.11708: no more pending results, returning what we have 8238 1726882408.11712: results queue empty 8238 1726882408.11713: checking for any_errors_fatal 8238 1726882408.11721: done checking for any_errors_fatal 8238 1726882408.11725: checking for max_fail_percentage 8238 1726882408.11726: done checking for max_fail_percentage 8238 1726882408.11727: checking to see if all hosts have failed and the running result is not ok 8238 1726882408.11728: done checking to see if all hosts have failed 8238 1726882408.11729: getting the remaining hosts for this loop 8238 1726882408.11730: done getting the remaining hosts for this loop 8238 1726882408.11734: getting the next task for host managed_node3 8238 1726882408.11746: done getting next task for host managed_node3 8238 1726882408.11748: ^ task is: TASK: meta (role_complete) 8238 1726882408.11752: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882408.11763: getting variables 8238 1726882408.11764: in VariableManager get_vars() 8238 1726882408.11811: Calling all_inventory to load vars for managed_node3 8238 1726882408.11814: Calling groups_inventory to load vars for managed_node3 8238 1726882408.11816: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882408.11834: Calling all_plugins_play to load vars for managed_node3 8238 1726882408.11838: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882408.11843: done sending task result for task 0affc7ec-ae25-54bc-d334-000000000091 8238 1726882408.11845: WORKER PROCESS EXITING 8238 1726882408.11849: Calling groups_plugins_play to load vars for managed_node3 8238 1726882408.12830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882408.14485: done with get_vars() 8238 1726882408.14514: done getting variables 8238 1726882408.14674: done queuing things up, now waiting for results queue to drain 8238 1726882408.14676: results queue empty 8238 1726882408.14678: checking for any_errors_fatal 8238 1726882408.14680: done checking for any_errors_fatal 8238 1726882408.14681: checking for max_fail_percentage 8238 1726882408.14682: done checking for max_fail_percentage 8238 1726882408.14683: checking to see if all hosts have failed and the running result is not ok 8238 1726882408.14684: done checking to see if all hosts have failed 8238 1726882408.14685: getting the remaining hosts for this loop 8238 1726882408.14686: done getting the remaining hosts for this loop 8238 1726882408.14688: getting the next task for host managed_node3 8238 1726882408.14693: done getting next task for host managed_node3 8238 1726882408.14695: ^ task is: TASK: Delete the device '{{ controller_device }}' 8238 1726882408.14702: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882408.14705: getting variables 8238 1726882408.14706: in VariableManager get_vars() 8238 1726882408.14728: Calling all_inventory to load vars for managed_node3 8238 1726882408.14730: Calling groups_inventory to load vars for managed_node3 8238 1726882408.14732: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882408.14738: Calling all_plugins_play to load vars for managed_node3 8238 1726882408.14741: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882408.14744: Calling groups_plugins_play to load vars for managed_node3 8238 1726882408.16444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882408.18601: done with get_vars() 8238 1726882408.18635: done getting variables 8238 1726882408.18693: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8238 1726882408.18828: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Friday 20 September 2024 21:33:28 -0400 (0:00:00.417) 0:00:38.343 ****** 8238 1726882408.18862: entering _queue_task() for managed_node3/command 8238 1726882408.19265: worker is 1 (out of 1 available) 8238 1726882408.19278: exiting _queue_task() for managed_node3/command 8238 1726882408.19293: done queuing things up, now waiting for results queue to drain 8238 1726882408.19295: waiting for pending results... 
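The "Re-test connectivity" task that completed just above (main.yml:192, queued as managed_node3/ping and returning ok with ping: pong) goes through the full remote-module lifecycle visible in the trace: echo ~ to resolve the remote home directory, creation of an ansible-tmp directory, an sftp upload of AnsiballZ_ping.py, chmod u+x, execution under /usr/bin/python3.12, and finally removal of the temporary directory. In the role this step is typically nothing more than a bare ping task; a sketch (the task body is assumed, it is not shown in this log):

# Sketch: a connectivity re-test is usually just the ping module,
# which round-trips {"ping": "pong"} as seen in the stdout chunk above.
- name: Re-test connectivity
  ansible.builtin.ping:
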
8238 1726882408.19591: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 8238 1726882408.19710: in run() - task 0affc7ec-ae25-54bc-d334-0000000000c1 8238 1726882408.19735: variable 'ansible_search_path' from source: unknown 8238 1726882408.19781: calling self._execute() 8238 1726882408.19953: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882408.19970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882408.19989: variable 'omit' from source: magic vars 8238 1726882408.20357: variable 'ansible_distribution_major_version' from source: facts 8238 1726882408.20370: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882408.20375: variable 'omit' from source: magic vars 8238 1726882408.20394: variable 'omit' from source: magic vars 8238 1726882408.20473: variable 'controller_device' from source: play vars 8238 1726882408.20488: variable 'omit' from source: magic vars 8238 1726882408.20525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882408.20556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882408.20573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882408.20588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882408.20598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882408.20727: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882408.20732: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882408.20735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882408.20737: Set connection var ansible_connection to ssh 8238 1726882408.20739: Set connection var ansible_shell_type to sh 8238 1726882408.20741: Set connection var ansible_pipelining to False 8238 1726882408.20743: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882408.20746: Set connection var ansible_timeout to 10 8238 1726882408.20748: Set connection var ansible_shell_executable to /bin/sh 8238 1726882408.20750: variable 'ansible_shell_executable' from source: unknown 8238 1726882408.20752: variable 'ansible_connection' from source: unknown 8238 1726882408.20755: variable 'ansible_module_compression' from source: unknown 8238 1726882408.20765: variable 'ansible_shell_type' from source: unknown 8238 1726882408.20768: variable 'ansible_shell_executable' from source: unknown 8238 1726882408.20770: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882408.20772: variable 'ansible_pipelining' from source: unknown 8238 1726882408.20774: variable 'ansible_timeout' from source: unknown 8238 1726882408.20777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882408.20895: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882408.20905: variable 'omit' from source: magic vars 8238 1726882408.20910: starting attempt loop 
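The command task now entering its attempt loop is the test playbook's cleanup step at tests_bond.yml:114, templated from controller_device into "Delete the device 'nm-bond'". Its module_args, printed further down, show a plain ip link del nm-bond that returns rc=1 with "Cannot find device", which indicates the device is already gone by the time this cleanup runs. A sketch of such a cleanup task: the command string and device name come from the log, while the failed_when handling and the inline variable are assumptions, since whether this particular test tolerates the non-zero return code is not visible in this excerpt:

# Sketch: remove the bond device; tolerate the "Cannot find device"
# case (rc=1) recorded in the module result below.
- name: Delete the device '{{ controller_device }}'
  ansible.builtin.command: ip link del {{ controller_device }}
  failed_when: false
  vars:
    controller_device: nm-bond   # assumed here; a play var in the real test
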
8238 1726882408.20913: running the handler 8238 1726882408.20929: _low_level_execute_command(): starting 8238 1726882408.20936: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882408.21472: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.21476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.21479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.21481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.21529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.21535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.21551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.21629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.23278: stdout chunk (state=3): >>>/root <<< 8238 1726882408.23394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.23447: stderr chunk (state=3): >>><<< 8238 1726882408.23451: stdout chunk (state=3): >>><<< 8238 1726882408.23473: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.23484: _low_level_execute_command(): starting 8238 1726882408.23490: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623 `" && echo ansible-tmp-1726882408.2347293-9749-148276919371623="` echo /root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623 `" ) && sleep 0' 8238 1726882408.23917: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.23921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.23945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.24000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.24005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.24007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.24089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.26065: stdout chunk (state=3): >>>ansible-tmp-1726882408.2347293-9749-148276919371623=/root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623 <<< 8238 1726882408.26333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.26336: stdout chunk (state=3): >>><<< 8238 1726882408.26339: stderr chunk (state=3): >>><<< 8238 1726882408.26341: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882408.2347293-9749-148276919371623=/root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.26345: variable 'ansible_module_compression' 
from source: unknown 8238 1726882408.26397: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882408.26442: variable 'ansible_facts' from source: unknown 8238 1726882408.26545: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/AnsiballZ_command.py 8238 1726882408.26794: Sending initial data 8238 1726882408.26798: Sent initial data (154 bytes) 8238 1726882408.27465: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 8238 1726882408.27546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.27588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.27699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.29269: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882408.29377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8238 1726882408.29477: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpkte3x1vy /root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/AnsiballZ_command.py <<< 8238 1726882408.29481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/AnsiballZ_command.py" <<< 8238 1726882408.29568: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpkte3x1vy" to remote "/root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/AnsiballZ_command.py" <<< 8238 1726882408.30634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.30637: stdout chunk (state=3): >>><<< 8238 1726882408.30640: stderr chunk (state=3): >>><<< 8238 1726882408.30642: done transferring module to remote 8238 1726882408.30644: _low_level_execute_command(): starting 8238 1726882408.30647: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/ /root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/AnsiballZ_command.py && sleep 0' 8238 1726882408.31332: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882408.31345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882408.31363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.31391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882408.31438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 8238 1726882408.31451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882408.31503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.31567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.31610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.31641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.31735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.33616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.33619: stdout chunk (state=3): >>><<< 8238 1726882408.33806: stderr chunk (state=3): >>><<< 8238 1726882408.33812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.33829: _low_level_execute_command(): starting 8238 1726882408.33832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/AnsiballZ_command.py && sleep 0' 8238 1726882408.34621: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882408.34632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882408.34643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.34736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.34747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.34766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.34779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.34900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.52065: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:33:28.511199", "end": "2024-09-20 21:33:28.518893", "delta": "0:00:00.007694", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882408.53471: stderr chunk (state=3): >>>debug2: Received exit status from 
master 1 Shared connection to 10.31.45.226 closed. <<< 8238 1726882408.53568: stderr chunk (state=3): >>><<< 8238 1726882408.53575: stdout chunk (state=3): >>><<< 8238 1726882408.53607: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:33:28.511199", "end": "2024-09-20 21:33:28.518893", "delta": "0:00:00.007694", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.226 closed. 
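The exchange above is one complete AnsiballZ round trip for the command module: the AnsiballZ_command.py wrapper is uploaded over the multiplexed SSH connection via sftp, made executable with chmod u+x, run with /usr/bin/python3.12, and the JSON it prints on stdout becomes the module result; the temporary directory is removed in the step that follows. The rc=1 ("Cannot find device \"nm-bond\"") does not fail the play, since the task result below reports failed_when_result: false. A minimal sketch of a cleanup task that would produce this call is shown here; the command and task name are taken from the log, while the changed_when and failed_when expressions are assumptions, not a copy of the playbook source:

# Hypothetical reconstruction of the "Delete the device 'nm-bond'" cleanup task.
# Only the command line and task name are confirmed by the log; the guard
# expressions below are assumed, chosen so a missing device is tolerated.
- name: Delete the device 'nm-bond'
  command: ip link del nm-bond
  register: nm_bond_del
  changed_when: false
  failed_when: nm_bond_del.rc != 0 and 'Cannot find device' not in nm_bond_del.stderr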
8238 1726882408.53687: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882408.53691: _low_level_execute_command(): starting 8238 1726882408.53694: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882408.2347293-9749-148276919371623/ > /dev/null 2>&1 && sleep 0' 8238 1726882408.54360: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882408.54365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882408.54376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882408.54378: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.54439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.54443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.54467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.54545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.56462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.56505: stderr chunk (state=3): >>><<< 8238 1726882408.56509: stdout chunk (state=3): >>><<< 8238 1726882408.56524: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.56531: handler run complete 8238 1726882408.56549: Evaluated conditional (False): False 8238 1726882408.56552: Evaluated conditional (False): False 8238 1726882408.56563: attempt loop complete, returning result 8238 1726882408.56566: _execute() done 8238 1726882408.56568: dumping result to json 8238 1726882408.56578: done dumping result, returning 8238 1726882408.56584: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0affc7ec-ae25-54bc-d334-0000000000c1] 8238 1726882408.56588: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c1 8238 1726882408.56690: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c1 8238 1726882408.56694: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007694", "end": "2024-09-20 21:33:28.518893", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:33:28.511199" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 8238 1726882408.56772: no more pending results, returning what we have 8238 1726882408.56776: results queue empty 8238 1726882408.56777: checking for any_errors_fatal 8238 1726882408.56779: done checking for any_errors_fatal 8238 1726882408.56779: checking for max_fail_percentage 8238 1726882408.56781: done checking for max_fail_percentage 8238 1726882408.56782: checking to see if all hosts have failed and the running result is not ok 8238 1726882408.56783: done checking to see if all hosts have failed 8238 1726882408.56784: getting the remaining hosts for this loop 8238 1726882408.56786: done getting the remaining hosts for this loop 8238 1726882408.56790: getting the next task for host managed_node3 8238 1726882408.56797: done getting next task for host managed_node3 8238 1726882408.56800: ^ task is: TASK: Remove test interfaces 8238 1726882408.56806: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882408.56812: getting variables 8238 1726882408.56813: in VariableManager get_vars() 8238 1726882408.56854: Calling all_inventory to load vars for managed_node3 8238 1726882408.56859: Calling groups_inventory to load vars for managed_node3 8238 1726882408.56861: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882408.56872: Calling all_plugins_play to load vars for managed_node3 8238 1726882408.56875: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882408.56878: Calling groups_plugins_play to load vars for managed_node3 8238 1726882408.62057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882408.63583: done with get_vars() 8238 1726882408.63603: done getting variables 8238 1726882408.63644: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:28 -0400 (0:00:00.448) 0:00:38.791 ****** 8238 1726882408.63665: entering _queue_task() for managed_node3/shell 8238 1726882408.64335: worker is 1 (out of 1 available) 8238 1726882408.64350: exiting _queue_task() for managed_node3/shell 8238 1726882408.64362: done queuing things up, now waiting for results queue to drain 8238 1726882408.64365: waiting for pending results... 8238 1726882408.65196: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 8238 1726882408.65637: in run() - task 0affc7ec-ae25-54bc-d334-0000000000c5 8238 1726882408.65641: variable 'ansible_search_path' from source: unknown 8238 1726882408.65644: variable 'ansible_search_path' from source: unknown 8238 1726882408.66042: calling self._execute() 8238 1726882408.66046: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882408.66049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882408.66052: variable 'omit' from source: magic vars 8238 1726882408.66782: variable 'ansible_distribution_major_version' from source: facts 8238 1726882408.66803: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882408.66817: variable 'omit' from source: magic vars 8238 1726882408.66892: variable 'omit' from source: magic vars 8238 1726882408.67083: variable 'dhcp_interface1' from source: play vars 8238 1726882408.67097: variable 'dhcp_interface2' from source: play vars 8238 1726882408.67127: variable 'omit' from source: magic vars 8238 1726882408.67183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882408.67235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882408.67265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882408.67291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882408.67309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 8238 1726882408.67351: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882408.67365: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882408.67376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882408.67498: Set connection var ansible_connection to ssh 8238 1726882408.67508: Set connection var ansible_shell_type to sh 8238 1726882408.67523: Set connection var ansible_pipelining to False 8238 1726882408.67536: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882408.67547: Set connection var ansible_timeout to 10 8238 1726882408.67564: Set connection var ansible_shell_executable to /bin/sh 8238 1726882408.67591: variable 'ansible_shell_executable' from source: unknown 8238 1726882408.67599: variable 'ansible_connection' from source: unknown 8238 1726882408.67606: variable 'ansible_module_compression' from source: unknown 8238 1726882408.67615: variable 'ansible_shell_type' from source: unknown 8238 1726882408.67632: variable 'ansible_shell_executable' from source: unknown 8238 1726882408.67888: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882408.67892: variable 'ansible_pipelining' from source: unknown 8238 1726882408.67896: variable 'ansible_timeout' from source: unknown 8238 1726882408.67926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882408.68377: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882408.68394: variable 'omit' from source: magic vars 8238 1726882408.68400: starting attempt loop 8238 1726882408.68403: running the handler 8238 1726882408.68415: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882408.68640: _low_level_execute_command(): starting 8238 1726882408.68644: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882408.70264: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882408.70282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882408.70299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.70320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882408.70341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882408.70354: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882408.70440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.70467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.70483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.70506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.70629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.72421: stdout chunk (state=3): >>>/root <<< 8238 1726882408.72831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.72835: stdout chunk (state=3): >>><<< 8238 1726882408.72837: stderr chunk (state=3): >>><<< 8238 1726882408.72840: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.72844: _low_level_execute_command(): starting 8238 1726882408.72847: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830 `" && echo ansible-tmp-1726882408.727367-9772-142517467026830="` echo /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830 `" ) && sleep 0' 8238 1726882408.73548: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882408.73561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882408.73637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.73677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.73692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.73737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.73869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.75827: stdout chunk (state=3): >>>ansible-tmp-1726882408.727367-9772-142517467026830=/root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830 <<< 8238 1726882408.76003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.76014: stdout chunk (state=3): >>><<< 8238 1726882408.76031: stderr chunk (state=3): >>><<< 8238 1726882408.76058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882408.727367-9772-142517467026830=/root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.76100: variable 'ansible_module_compression' from source: unknown 8238 1726882408.76167: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882408.76210: variable 'ansible_facts' from source: unknown 8238 1726882408.76308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/AnsiballZ_command.py 8238 1726882408.76500: Sending initial data 8238 1726882408.76503: Sent initial data (153 bytes) 8238 1726882408.77241: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.77277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882408.77297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.77311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.77473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.79046: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8238 1726882408.79067: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882408.79181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882408.79285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp1_ar5aod /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/AnsiballZ_command.py <<< 8238 1726882408.79289: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/AnsiballZ_command.py" <<< 8238 1726882408.79369: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmp1_ar5aod" to remote "/root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/AnsiballZ_command.py" <<< 8238 1726882408.80508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.80539: stderr chunk (state=3): >>><<< 8238 1726882408.80724: stdout chunk (state=3): >>><<< 8238 1726882408.80728: done transferring module to remote 8238 1726882408.80734: _low_level_execute_command(): starting 8238 1726882408.81028: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/ /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/AnsiballZ_command.py && sleep 0' 8238 1726882408.81402: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882408.81405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882408.81408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.81414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.81416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.81738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.81772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882408.83686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882408.83696: stdout chunk (state=3): >>><<< 8238 1726882408.83708: stderr chunk (state=3): >>><<< 8238 1726882408.83737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882408.83752: _low_level_execute_command(): starting 8238 1726882408.83765: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/AnsiballZ_command.py && sleep 0' 8238 1726882408.84888: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882408.84901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 
1726882408.84911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882408.84969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882408.85115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882408.85207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.05707: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:29.015413", "end": "2024-09-20 21:33:29.053727", "delta": "0:00:00.038314", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882409.07529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 8238 1726882409.07533: stdout chunk (state=3): >>><<< 8238 1726882409.07536: stderr chunk (state=3): >>><<< 8238 1726882409.07540: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:29.015413", "end": "2024-09-20 21:33:29.053727", "delta": "0:00:00.038314", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
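The module_args echoed in the result above carry the complete script that this shell action wrapped. Reassembled from the _raw_params field into task form (the script body is verbatim from the log; the surrounding YAML keys are a sketch of what remove_test_interfaces_with_dhcp.yml:3 plausibly contains, not a copy of it):

# Sketch of the "Remove test interfaces" task; script copied from _raw_params.
- name: Remove test interfaces
  shell: |
    set -euxo pipefail
    exec 1>&2
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link testbr - error "$rc"
    fi

The exec 1>&2 at the top of the script is why the result has an empty stdout and the whole set -x trace appears in stderr.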
8238 1726882409.07546: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882409.07549: _low_level_execute_command(): starting 8238 1726882409.07551: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882408.727367-9772-142517467026830/ > /dev/null 2>&1 && sleep 0' 8238 1726882409.08204: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882409.08208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882409.08211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882409.08213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882409.08216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882409.08219: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882409.08229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.08261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882409.08303: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.08364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882409.08377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.08394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.08552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.10537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.10573: stderr chunk (state=3): >>><<< 8238 1726882409.10586: stdout chunk (state=3): >>><<< 8238 1726882409.10610: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882409.10625: handler run complete 8238 1726882409.10664: Evaluated conditional (False): False 8238 1726882409.10680: attempt loop complete, returning result 8238 1726882409.10687: _execute() done 8238 1726882409.10695: dumping result to json 8238 1726882409.10704: done dumping result, returning 8238 1726882409.10716: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0affc7ec-ae25-54bc-d334-0000000000c5] 8238 1726882409.10732: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c5 ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.038314", "end": "2024-09-20 21:33:29.053727", "rc": 0, "start": "2024-09-20 21:33:29.015413" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 8238 1726882409.10946: no more pending results, returning what we have 8238 1726882409.10951: results queue empty 8238 1726882409.10952: checking for any_errors_fatal 8238 1726882409.10964: done checking for any_errors_fatal 8238 1726882409.10965: checking for max_fail_percentage 8238 1726882409.10966: done checking for max_fail_percentage 8238 1726882409.10968: checking to see if all hosts have failed and the running result is not ok 8238 1726882409.10968: done checking to see if all hosts have failed 8238 1726882409.10969: getting the remaining hosts for this loop 8238 1726882409.10971: done getting the remaining hosts for this loop 8238 1726882409.10975: getting the next task for host managed_node3 8238 1726882409.10984: done getting next task for host managed_node3 8238 1726882409.10987: ^ task is: TASK: Stop dnsmasq/radvd services 8238 1726882409.10991: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882409.10997: getting variables 8238 1726882409.10999: in VariableManager get_vars() 8238 1726882409.11262: Calling all_inventory to load vars for managed_node3 8238 1726882409.11266: Calling groups_inventory to load vars for managed_node3 8238 1726882409.11268: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882409.11275: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c5 8238 1726882409.11285: WORKER PROCESS EXITING 8238 1726882409.11297: Calling all_plugins_play to load vars for managed_node3 8238 1726882409.11300: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882409.11304: Calling groups_plugins_play to load vars for managed_node3 8238 1726882409.13155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882409.15967: done with get_vars() 8238 1726882409.16043: done getting variables 8238 1726882409.16174: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:33:29 -0400 (0:00:00.525) 0:00:39.317 ****** 8238 1726882409.16240: entering _queue_task() for managed_node3/shell 8238 1726882409.16641: worker is 1 (out of 1 available) 8238 1726882409.16656: exiting _queue_task() for managed_node3/shell 8238 1726882409.16669: done queuing things up, now waiting for results queue to drain 8238 1726882409.16670: waiting for pending results... 
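Each task repeats the same connection bootstrap seen in the next chunk: the worker resolves per-host connection variables before opening the SSH channel. Expressed as inventory host vars, the values the "Set connection var" lines report would look roughly like the sketch below; the values are copied from the log, but keeping them in a host_vars file is an assumption, since several are simply ansible-core defaults:

# Hypothetical host_vars equivalent of the connection settings this run
# resolves for managed_node3 (values mirror the "Set connection var" lines).
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED
ansible_timeout: 10

With ansible_pipelining left at False, every module invocation goes through the put/chmod/execute/rm cycle logged above instead of being piped into a single remote python process.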
8238 1726882409.16938: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 8238 1726882409.17119: in run() - task 0affc7ec-ae25-54bc-d334-0000000000c6 8238 1726882409.17143: variable 'ansible_search_path' from source: unknown 8238 1726882409.17165: variable 'ansible_search_path' from source: unknown 8238 1726882409.17207: calling self._execute() 8238 1726882409.17427: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.17431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.17434: variable 'omit' from source: magic vars 8238 1726882409.17782: variable 'ansible_distribution_major_version' from source: facts 8238 1726882409.17805: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882409.17816: variable 'omit' from source: magic vars 8238 1726882409.17883: variable 'omit' from source: magic vars 8238 1726882409.17945: variable 'omit' from source: magic vars 8238 1726882409.17995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882409.18149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882409.18174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882409.18197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882409.18214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882409.18256: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882409.18428: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.18432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.18667: Set connection var ansible_connection to ssh 8238 1726882409.18670: Set connection var ansible_shell_type to sh 8238 1726882409.18672: Set connection var ansible_pipelining to False 8238 1726882409.18674: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882409.18676: Set connection var ansible_timeout to 10 8238 1726882409.18677: Set connection var ansible_shell_executable to /bin/sh 8238 1726882409.18679: variable 'ansible_shell_executable' from source: unknown 8238 1726882409.18681: variable 'ansible_connection' from source: unknown 8238 1726882409.18683: variable 'ansible_module_compression' from source: unknown 8238 1726882409.18685: variable 'ansible_shell_type' from source: unknown 8238 1726882409.18688: variable 'ansible_shell_executable' from source: unknown 8238 1726882409.18690: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.18691: variable 'ansible_pipelining' from source: unknown 8238 1726882409.18695: variable 'ansible_timeout' from source: unknown 8238 1726882409.18697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.19104: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882409.19109: variable 'omit' from source: magic vars 8238 1726882409.19112: starting attempt loop 8238 
1726882409.19114: running the handler 8238 1726882409.19117: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882409.19119: _low_level_execute_command(): starting 8238 1726882409.19124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882409.20201: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.20307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.22530: stdout chunk (state=3): >>>/root <<< 8238 1726882409.22533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.22536: stdout chunk (state=3): >>><<< 8238 1726882409.22538: stderr chunk (state=3): >>><<< 8238 1726882409.22542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882409.22544: _low_level_execute_command(): starting 8238 1726882409.22547: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778 `" && echo ansible-tmp-1726882409.2247427-9810-187720624851778="` echo /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778 `" ) && sleep 0' 8238 1726882409.23878: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882409.23882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882409.23902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.23918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.24039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.26026: stdout chunk (state=3): >>>ansible-tmp-1726882409.2247427-9810-187720624851778=/root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778 <<< 8238 1726882409.26235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.26243: stderr chunk (state=3): >>><<< 8238 1726882409.26311: stdout chunk (state=3): >>><<< 8238 1726882409.26315: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882409.2247427-9810-187720624851778=/root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882409.26604: variable 'ansible_module_compression' from source: unknown 8238 1726882409.26608: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882409.26610: variable 'ansible_facts' from source: unknown 8238 1726882409.26681: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py 8238 1726882409.26858: Sending initial data 8238 1726882409.26865: Sent initial data (154 bytes) 8238 1726882409.27850: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882409.27857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882409.27931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.27957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882409.27967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.28075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.29854: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
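For readability, the remote bootstrap sequence that this task traces above and below can be summarized as three plain shell commands, copied from the logged _low_level_execute_command() calls (the timestamped directory name is the one from this run, not a general value):

    /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `" && mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778 `" && echo ansible-tmp-1726882409.2247427-9810-187720624851778="` echo /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778 `" ) && sleep 0'
    /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/ /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py && sleep 0'
    /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py && sleep 0'

The AnsiballZ_command.py payload itself is transferred over SFTP in the chunk that follows, then removed again once the module result has been collected.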
<<< 8238 1726882409.29976: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpzawhdm7o /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py <<< 8238 1726882409.29980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py" <<< 8238 1726882409.30055: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmpzawhdm7o" to remote "/root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py" <<< 8238 1726882409.30067: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py" <<< 8238 1726882409.31258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.31262: stderr chunk (state=3): >>><<< 8238 1726882409.31264: stdout chunk (state=3): >>><<< 8238 1726882409.31266: done transferring module to remote 8238 1726882409.31269: _low_level_execute_command(): starting 8238 1726882409.31271: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/ /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py && sleep 0' 8238 1726882409.31796: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882409.31812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882409.31832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882409.31937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.31963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.32077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.33898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.33984: stderr chunk (state=3): >>><<< 8238 1726882409.33999: stdout chunk (state=3): >>><<< 8238 1726882409.34021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882409.34037: _low_level_execute_command(): starting 8238 1726882409.34046: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/AnsiballZ_command.py && sleep 0' 8238 1726882409.34674: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882409.34688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882409.34704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882409.34725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882409.34744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882409.34768: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.34843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.34887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882409.34908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.34932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.35054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.54895: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd 
--remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:29.516466", "end": "2024-09-20 21:33:29.545657", "delta": "0:00:00.029191", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882409.56672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 8238 1726882409.56676: stdout chunk (state=3): >>><<< 8238 1726882409.56685: stderr chunk (state=3): >>><<< 8238 1726882409.56762: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:29.516466", "end": "2024-09-20 21:33:29.545657", "delta": "0:00:00.029191", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882409.56833: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882409.56842: _low_level_execute_command(): starting 8238 1726882409.56844: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882409.2247427-9810-187720624851778/ > /dev/null 2>&1 && sleep 0' 8238 1726882409.57762: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882409.57804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882409.57808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882409.57811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882409.57813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882409.57816: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882409.57821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.57911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882409.57915: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882409.57917: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8238 1726882409.57919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882409.57921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882409.57926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882409.57928: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882409.57930: stderr chunk (state=3): >>>debug2: match found <<< 8238 1726882409.57932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.57973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882409.57993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.58005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.58125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.60328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.60333: stderr chunk (state=3): >>><<< 8238 1726882409.60335: stdout chunk (state=3): >>><<< 8238 1726882409.60338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882409.60341: handler run complete 8238 1726882409.60343: Evaluated conditional (False): False 8238 1726882409.60351: attempt loop complete, returning result 8238 1726882409.60354: _execute() done 8238 1726882409.60359: dumping result to json 8238 1726882409.60364: done dumping result, returning 8238 1726882409.60372: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0affc7ec-ae25-54bc-d334-0000000000c6] 8238 1726882409.60379: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c6 ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.029191", "end": "2024-09-20 21:33:29.545657", "rc": 0, "start": "2024-09-20 21:33:29.516466" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld 
inactive 8238 1726882409.60592: no more pending results, returning what we have 8238 1726882409.60595: results queue empty 8238 1726882409.60597: checking for any_errors_fatal 8238 1726882409.60607: done checking for any_errors_fatal 8238 1726882409.60608: checking for max_fail_percentage 8238 1726882409.60609: done checking for max_fail_percentage 8238 1726882409.60610: checking to see if all hosts have failed and the running result is not ok 8238 1726882409.60611: done checking to see if all hosts have failed 8238 1726882409.60612: getting the remaining hosts for this loop 8238 1726882409.60614: done getting the remaining hosts for this loop 8238 1726882409.60619: getting the next task for host managed_node3 8238 1726882409.60637: done getting next task for host managed_node3 8238 1726882409.60640: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 8238 1726882409.60645: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8238 1726882409.60649: getting variables 8238 1726882409.60651: in VariableManager get_vars() 8238 1726882409.60695: Calling all_inventory to load vars for managed_node3 8238 1726882409.60698: Calling groups_inventory to load vars for managed_node3 8238 1726882409.60700: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882409.60971: Calling all_plugins_play to load vars for managed_node3 8238 1726882409.60976: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882409.60980: Calling groups_plugins_play to load vars for managed_node3 8238 1726882409.61641: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c6 8238 1726882409.61645: WORKER PROCESS EXITING 8238 1726882409.62958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882409.65563: done with get_vars() 8238 1726882409.65599: done getting variables 8238 1726882409.65704: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Friday 20 September 2024 21:33:29 -0400 (0:00:00.495) 0:00:39.812 ****** 8238 1726882409.65787: entering _queue_task() for managed_node3/command 8238 1726882409.66424: worker is 1 (out of 1 available) 8238 1726882409.66436: exiting _queue_task() for managed_node3/command 8238 1726882409.66446: done queuing things up, now waiting for results queue to drain 8238 1726882409.66447: waiting for pending results... 
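For reference, the cleanup script reported by the "Stop dnsmasq/radvd services" task result above, with the JSON escaping removed so it reads as plain shell (the content is exactly what the log records; nothing has been added):

    set -uxo pipefail
    exec 1>&2
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
        # Stop radvd server
        service radvd stop
        iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
        for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
                firewall-cmd --remove-service "$service"
            fi
        done
    fi

On this run the 'release 6' branch did not match and firewalld reported inactive, so the script only removed the dhcp_testbr pid and lease files, as the STDERR trace above shows.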
8238 1726882409.66680: running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript 8238 1726882409.66826: in run() - task 0affc7ec-ae25-54bc-d334-0000000000c7 8238 1726882409.66849: variable 'ansible_search_path' from source: unknown 8238 1726882409.66897: calling self._execute() 8238 1726882409.67019: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.67035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.67050: variable 'omit' from source: magic vars 8238 1726882409.67472: variable 'ansible_distribution_major_version' from source: facts 8238 1726882409.67492: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882409.67643: variable 'network_provider' from source: set_fact 8238 1726882409.67657: Evaluated conditional (network_provider == "initscripts"): False 8238 1726882409.67828: when evaluation is False, skipping this task 8238 1726882409.67832: _execute() done 8238 1726882409.67835: dumping result to json 8238 1726882409.67837: done dumping result, returning 8238 1726882409.67840: done running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript [0affc7ec-ae25-54bc-d334-0000000000c7] 8238 1726882409.67842: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c7 8238 1726882409.67919: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c7 8238 1726882409.67925: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8238 1726882409.67982: no more pending results, returning what we have 8238 1726882409.67987: results queue empty 8238 1726882409.67988: checking for any_errors_fatal 8238 1726882409.68001: done checking for any_errors_fatal 8238 1726882409.68003: checking for max_fail_percentage 8238 1726882409.68004: done checking for max_fail_percentage 8238 1726882409.68005: checking to see if all hosts have failed and the running result is not ok 8238 1726882409.68006: done checking to see if all hosts have failed 8238 1726882409.68007: getting the remaining hosts for this loop 8238 1726882409.68010: done getting the remaining hosts for this loop 8238 1726882409.68015: getting the next task for host managed_node3 8238 1726882409.68028: done getting next task for host managed_node3 8238 1726882409.68031: ^ task is: TASK: Verify network state restored to default 8238 1726882409.68036: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882409.68041: getting variables 8238 1726882409.68042: in VariableManager get_vars() 8238 1726882409.68091: Calling all_inventory to load vars for managed_node3 8238 1726882409.68095: Calling groups_inventory to load vars for managed_node3 8238 1726882409.68098: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882409.68113: Calling all_plugins_play to load vars for managed_node3 8238 1726882409.68117: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882409.68121: Calling groups_plugins_play to load vars for managed_node3 8238 1726882409.70732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882409.72840: done with get_vars() 8238 1726882409.72874: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Friday 20 September 2024 21:33:29 -0400 (0:00:00.071) 0:00:39.884 ****** 8238 1726882409.72980: entering _queue_task() for managed_node3/include_tasks 8238 1726882409.73599: worker is 1 (out of 1 available) 8238 1726882409.73613: exiting _queue_task() for managed_node3/include_tasks 8238 1726882409.73632: done queuing things up, now waiting for results queue to drain 8238 1726882409.73633: waiting for pending results... 8238 1726882409.74046: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 8238 1726882409.74098: in run() - task 0affc7ec-ae25-54bc-d334-0000000000c8 8238 1726882409.74124: variable 'ansible_search_path' from source: unknown 8238 1726882409.74175: calling self._execute() 8238 1726882409.74299: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.74314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.74334: variable 'omit' from source: magic vars 8238 1726882409.74782: variable 'ansible_distribution_major_version' from source: facts 8238 1726882409.74805: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882409.74818: _execute() done 8238 1726882409.74829: dumping result to json 8238 1726882409.74838: done dumping result, returning 8238 1726882409.74850: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0affc7ec-ae25-54bc-d334-0000000000c8] 8238 1726882409.74866: sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c8 8238 1726882409.75157: no more pending results, returning what we have 8238 1726882409.75164: in VariableManager get_vars() 8238 1726882409.75217: Calling all_inventory to load vars for managed_node3 8238 1726882409.75220: Calling groups_inventory to load vars for managed_node3 8238 1726882409.75225: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882409.75242: Calling all_plugins_play to load vars for managed_node3 8238 1726882409.75246: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882409.75250: Calling groups_plugins_play to load vars for managed_node3 8238 1726882409.75841: done sending task result for task 0affc7ec-ae25-54bc-d334-0000000000c8 8238 1726882409.75845: WORKER PROCESS EXITING 8238 1726882409.77210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882409.80111: done with get_vars() 8238 1726882409.80136: variable 'ansible_search_path' from source: 
unknown 8238 1726882409.80153: we have included files to process 8238 1726882409.80157: generating all_blocks data 8238 1726882409.80163: done generating all_blocks data 8238 1726882409.80169: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8238 1726882409.80170: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8238 1726882409.80173: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8238 1726882409.80628: done processing included file 8238 1726882409.80631: iterating over new_blocks loaded from include file 8238 1726882409.80632: in VariableManager get_vars() 8238 1726882409.80652: done with get_vars() 8238 1726882409.80654: filtering new block on tags 8238 1726882409.80696: done filtering new block on tags 8238 1726882409.80699: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 8238 1726882409.80704: extending task lists for all hosts with included blocks 8238 1726882409.82072: done extending task lists 8238 1726882409.82074: done processing included files 8238 1726882409.82075: results queue empty 8238 1726882409.82076: checking for any_errors_fatal 8238 1726882409.82079: done checking for any_errors_fatal 8238 1726882409.82080: checking for max_fail_percentage 8238 1726882409.82081: done checking for max_fail_percentage 8238 1726882409.82082: checking to see if all hosts have failed and the running result is not ok 8238 1726882409.82083: done checking to see if all hosts have failed 8238 1726882409.82083: getting the remaining hosts for this loop 8238 1726882409.82085: done getting the remaining hosts for this loop 8238 1726882409.82087: getting the next task for host managed_node3 8238 1726882409.82091: done getting next task for host managed_node3 8238 1726882409.82094: ^ task is: TASK: Check routes and DNS 8238 1726882409.82097: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882409.82100: getting variables 8238 1726882409.82101: in VariableManager get_vars() 8238 1726882409.82114: Calling all_inventory to load vars for managed_node3 8238 1726882409.82117: Calling groups_inventory to load vars for managed_node3 8238 1726882409.82119: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882409.82127: Calling all_plugins_play to load vars for managed_node3 8238 1726882409.82130: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882409.82133: Calling groups_plugins_play to load vars for managed_node3 8238 1726882409.83579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882409.85624: done with get_vars() 8238 1726882409.85650: done getting variables 8238 1726882409.85700: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:33:29 -0400 (0:00:00.127) 0:00:40.012 ****** 8238 1726882409.85736: entering _queue_task() for managed_node3/shell 8238 1726882409.86111: worker is 1 (out of 1 available) 8238 1726882409.86127: exiting _queue_task() for managed_node3/shell 8238 1726882409.86141: done queuing things up, now waiting for results queue to drain 8238 1726882409.86143: waiting for pending results... 8238 1726882409.86411: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 8238 1726882409.86562: in run() - task 0affc7ec-ae25-54bc-d334-00000000056d 8238 1726882409.86576: variable 'ansible_search_path' from source: unknown 8238 1726882409.86579: variable 'ansible_search_path' from source: unknown 8238 1726882409.86661: calling self._execute() 8238 1726882409.86733: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.86739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.86757: variable 'omit' from source: magic vars 8238 1726882409.87175: variable 'ansible_distribution_major_version' from source: facts 8238 1726882409.87193: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882409.87199: variable 'omit' from source: magic vars 8238 1726882409.87253: variable 'omit' from source: magic vars 8238 1726882409.87296: variable 'omit' from source: magic vars 8238 1726882409.87379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8238 1726882409.87383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8238 1726882409.87390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8238 1726882409.87418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882409.87430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8238 1726882409.87464: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8238 1726882409.87468: variable 
'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.87471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.87596: Set connection var ansible_connection to ssh 8238 1726882409.87599: Set connection var ansible_shell_type to sh 8238 1726882409.87601: Set connection var ansible_pipelining to False 8238 1726882409.87604: Set connection var ansible_module_compression to ZIP_DEFLATED 8238 1726882409.87606: Set connection var ansible_timeout to 10 8238 1726882409.87703: Set connection var ansible_shell_executable to /bin/sh 8238 1726882409.87706: variable 'ansible_shell_executable' from source: unknown 8238 1726882409.87709: variable 'ansible_connection' from source: unknown 8238 1726882409.87712: variable 'ansible_module_compression' from source: unknown 8238 1726882409.87714: variable 'ansible_shell_type' from source: unknown 8238 1726882409.87716: variable 'ansible_shell_executable' from source: unknown 8238 1726882409.87719: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882409.87721: variable 'ansible_pipelining' from source: unknown 8238 1726882409.87725: variable 'ansible_timeout' from source: unknown 8238 1726882409.87727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882409.87812: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882409.87825: variable 'omit' from source: magic vars 8238 1726882409.87831: starting attempt loop 8238 1726882409.87834: running the handler 8238 1726882409.87928: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8238 1726882409.87933: _low_level_execute_command(): starting 8238 1726882409.87935: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8238 1726882409.89608: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.89716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.89814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 8238 1726882409.91583: stdout chunk (state=3): >>>/root <<< 8238 1726882409.91765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.91791: stdout chunk (state=3): >>><<< 8238 1726882409.91811: stderr chunk (state=3): >>><<< 8238 1726882409.91933: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882409.91937: _low_level_execute_command(): starting 8238 1726882409.91941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868 `" && echo ansible-tmp-1726882409.918353-9852-188076930839868="` echo /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868 `" ) && sleep 0' 8238 1726882409.92967: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882409.92971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882409.92974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882409.92983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.93071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.93242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.93505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882409.95473: stdout chunk (state=3): 
>>>ansible-tmp-1726882409.918353-9852-188076930839868=/root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868 <<< 8238 1726882409.95745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882409.95748: stdout chunk (state=3): >>><<< 8238 1726882409.95751: stderr chunk (state=3): >>><<< 8238 1726882409.95771: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882409.918353-9852-188076930839868=/root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882409.95927: variable 'ansible_module_compression' from source: unknown 8238 1726882409.95930: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82389jlm8v9k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8238 1726882409.95933: variable 'ansible_facts' from source: unknown 8238 1726882409.96015: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/AnsiballZ_command.py 8238 1726882409.96181: Sending initial data 8238 1726882409.96280: Sent initial data (153 bytes) 8238 1726882409.96908: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882409.97069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882409.97093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882409.97199: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 8238 1726882409.98820: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8238 1726882409.98837: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 8238 1726882409.98861: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8238 1726882409.98963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8238 1726882409.99088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82389jlm8v9k/tmprfwgfig2 /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/AnsiballZ_command.py <<< 8238 1726882409.99091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/AnsiballZ_command.py" <<< 8238 1726882409.99166: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-82389jlm8v9k/tmprfwgfig2" to remote "/root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/AnsiballZ_command.py" <<< 8238 1726882410.00699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882410.00702: stderr chunk (state=3): >>><<< 8238 1726882410.00737: stdout chunk (state=3): >>><<< 8238 1726882410.01018: done transferring module to remote 8238 1726882410.01041: _low_level_execute_command(): starting 8238 1726882410.01045: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/ /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/AnsiballZ_command.py && sleep 0' 8238 1726882410.01771: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882410.01775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 8238 1726882410.01791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882410.01809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882410.01816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 
1726882410.01820: stderr chunk (state=3): >>>debug2: match found <<< 8238 1726882410.01836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882410.01937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882410.02033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882410.04080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882410.04083: stdout chunk (state=3): >>><<< 8238 1726882410.04086: stderr chunk (state=3): >>><<< 8238 1726882410.04088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882410.04090: _low_level_execute_command(): starting 8238 1726882410.04093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/AnsiballZ_command.py && sleep 0' 8238 1726882410.05065: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882410.05081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882410.05102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882410.05177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882410.05232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882410.05258: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 8238 1726882410.05286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8238 1726882410.05436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882410.23084: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:19:da:ea:a3:f3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.226/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3093sec preferred_lft 3093sec\n inet6 fe80::19:daff:feea:a3f3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.226 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.226 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:33:30.219479", "end": "2024-09-20 21:33:30.228901", "delta": "0:00:00.009422", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8238 1726882410.24803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882410.24807: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
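Likewise, the "Check routes and DNS" task that just ran executes the following diagnostic script, unescaped here from the module result above; it only inspects interfaces, routes, and the resolver configuration:

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi

Its stdout in this run shows a single eth0 interface with a DHCP default route via 10.31.44.1 and the systemd-resolved stub resolver (nameserver 127.0.0.53) behind the /etc/resolv.conf symlink, i.e. the expected default network state.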
<<< 8238 1726882410.24811: stdout chunk (state=3): >>><<< 8238 1726882410.24813: stderr chunk (state=3): >>><<< 8238 1726882410.24843: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:19:da:ea:a3:f3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.226/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3093sec preferred_lft 3093sec\n inet6 fe80::19:daff:feea:a3f3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.226 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.226 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:33:30.219479", "end": "2024-09-20 21:33:30.228901", "delta": "0:00:00.009422", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 8238 1726882410.24929: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8238 1726882410.24933: _low_level_execute_command(): starting 8238 1726882410.25010: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882409.918353-9852-188076930839868/ > /dev/null 2>&1 && sleep 0' 8238 1726882410.25668: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8238 1726882410.25683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8238 1726882410.25699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8238 1726882410.25720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8238 1726882410.25740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 8238 1726882410.25753: stderr chunk (state=3): >>>debug2: match not found <<< 8238 1726882410.25773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882410.25795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8238 1726882410.25808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 8238 1726882410.25839: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8238 1726882410.25927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 8238 1726882410.25945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 8238 1726882410.26064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8238 1726882410.28063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8238 1726882410.28137: stdout chunk (state=3): >>><<< 8238 1726882410.28140: stderr chunk (state=3): >>><<< 8238 1726882410.28193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8238 1726882410.28311: handler run complete 8238 1726882410.28314: Evaluated conditional (False): False 8238 1726882410.28317: attempt loop complete, returning result 8238 1726882410.28320: _execute() done 8238 1726882410.28324: dumping result to json 8238 1726882410.28346: done dumping result, returning 8238 1726882410.28378: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affc7ec-ae25-54bc-d334-00000000056d] 8238 1726882410.28416: sending task result for task 0affc7ec-ae25-54bc-d334-00000000056d ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009422", "end": "2024-09-20 21:33:30.228901", "rc": 0, "start": "2024-09-20 21:33:30.219479" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:19:da:ea:a3:f3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.226/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3093sec preferred_lft 3093sec inet6 fe80::19:daff:feea:a3f3/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.226 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.226 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. 
If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 8238 1726882410.28815: no more pending results, returning what we have 8238 1726882410.28819: results queue empty 8238 1726882410.28820: checking for any_errors_fatal 8238 1726882410.28824: done checking for any_errors_fatal 8238 1726882410.28825: checking for max_fail_percentage 8238 1726882410.28827: done checking for max_fail_percentage 8238 1726882410.28828: checking to see if all hosts have failed and the running result is not ok 8238 1726882410.28829: done checking to see if all hosts have failed 8238 1726882410.28830: getting the remaining hosts for this loop 8238 1726882410.28832: done getting the remaining hosts for this loop 8238 1726882410.28841: getting the next task for host managed_node3 8238 1726882410.28850: done getting next task for host managed_node3 8238 1726882410.28854: ^ task is: TASK: Verify DNS and network connectivity 8238 1726882410.28861: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8238 1726882410.28867: getting variables 8238 1726882410.28869: in VariableManager get_vars() 8238 1726882410.28914: Calling all_inventory to load vars for managed_node3 8238 1726882410.28918: Calling groups_inventory to load vars for managed_node3 8238 1726882410.28920: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882410.29237: Calling all_plugins_play to load vars for managed_node3 8238 1726882410.29242: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882410.29249: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000056d 8238 1726882410.29252: WORKER PROCESS EXITING 8238 1726882410.29255: Calling groups_plugins_play to load vars for managed_node3 8238 1726882410.32390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882410.35405: done with get_vars() 8238 1726882410.35439: done getting variables 8238 1726882410.35515: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:33:30 -0400 (0:00:00.498) 0:00:40.510 ****** 8238 1726882410.35552: entering _queue_task() for managed_node3/shell 8238 1726882410.35933: worker is 1 (out of 1 available) 8238 1726882410.35947: exiting _queue_task() for managed_node3/shell 8238 1726882410.35964: done queuing things up, now waiting for results queue to drain 8238 1726882410.35965: waiting for pending results... 
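The module arguments logged above for TASK: Check routes and DNS (check_network_dns.yml:6) map onto an ordinary shell task. The sketch below is a minimal reconstruction under two assumptions: the command body is copied from the logged _raw_params, and the changed_when: false keyword is inferred from the reported result showing "changed": false even though the raw command module returned "changed": true.

    - name: Check routes and DNS
      ansible.builtin.shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi
      # Assumption: the task reports "changed": false while the raw module
      # return was "changed": true, which suggests changed_when: false here.
      changed_when: false

Collecting ip a, both routing tables, and /etc/resolv.conf in one task keeps the diagnostic output together in a single STDOUT block, which is exactly what the result above shows.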
8238 1726882410.36297: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 8238 1726882410.36613: in run() - task 0affc7ec-ae25-54bc-d334-00000000056e 8238 1726882410.36618: variable 'ansible_search_path' from source: unknown 8238 1726882410.36621: variable 'ansible_search_path' from source: unknown 8238 1726882410.36626: calling self._execute() 8238 1726882410.36716: variable 'ansible_host' from source: host vars for 'managed_node3' 8238 1726882410.36720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8238 1726882410.36733: variable 'omit' from source: magic vars 8238 1726882410.37394: variable 'ansible_distribution_major_version' from source: facts 8238 1726882410.37398: Evaluated conditional (ansible_distribution_major_version != '6'): True 8238 1726882410.37541: variable 'ansible_facts' from source: unknown 8238 1726882410.38692: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 8238 1726882410.38696: when evaluation is False, skipping this task 8238 1726882410.38700: _execute() done 8238 1726882410.38702: dumping result to json 8238 1726882410.38705: done dumping result, returning 8238 1726882410.38712: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affc7ec-ae25-54bc-d334-00000000056e] 8238 1726882410.38723: sending task result for task 0affc7ec-ae25-54bc-d334-00000000056e 8238 1726882410.38836: done sending task result for task 0affc7ec-ae25-54bc-d334-00000000056e 8238 1726882410.38839: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 8238 1726882410.38921: no more pending results, returning what we have 8238 1726882410.38928: results queue empty 8238 1726882410.38929: checking for any_errors_fatal 8238 1726882410.38940: done checking for any_errors_fatal 8238 1726882410.38941: checking for max_fail_percentage 8238 1726882410.38942: done checking for max_fail_percentage 8238 1726882410.38944: checking to see if all hosts have failed and the running result is not ok 8238 1726882410.38945: done checking to see if all hosts have failed 8238 1726882410.38945: getting the remaining hosts for this loop 8238 1726882410.38947: done getting the remaining hosts for this loop 8238 1726882410.38952: getting the next task for host managed_node3 8238 1726882410.38969: done getting next task for host managed_node3 8238 1726882410.38972: ^ task is: TASK: meta (flush_handlers) 8238 1726882410.38977: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882410.38983: getting variables 8238 1726882410.38985: in VariableManager get_vars() 8238 1726882410.39239: Calling all_inventory to load vars for managed_node3 8238 1726882410.39242: Calling groups_inventory to load vars for managed_node3 8238 1726882410.39245: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882410.39260: Calling all_plugins_play to load vars for managed_node3 8238 1726882410.39263: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882410.39266: Calling groups_plugins_play to load vars for managed_node3 8238 1726882410.41487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882410.44787: done with get_vars() 8238 1726882410.44851: done getting variables 8238 1726882410.44976: in VariableManager get_vars() 8238 1726882410.45004: Calling all_inventory to load vars for managed_node3 8238 1726882410.45007: Calling groups_inventory to load vars for managed_node3 8238 1726882410.45010: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882410.45016: Calling all_plugins_play to load vars for managed_node3 8238 1726882410.45019: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882410.45024: Calling groups_plugins_play to load vars for managed_node3 8238 1726882410.46944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882410.49324: done with get_vars() 8238 1726882410.49366: done queuing things up, now waiting for results queue to drain 8238 1726882410.49368: results queue empty 8238 1726882410.49369: checking for any_errors_fatal 8238 1726882410.49372: done checking for any_errors_fatal 8238 1726882410.49373: checking for max_fail_percentage 8238 1726882410.49374: done checking for max_fail_percentage 8238 1726882410.49375: checking to see if all hosts have failed and the running result is not ok 8238 1726882410.49376: done checking to see if all hosts have failed 8238 1726882410.49376: getting the remaining hosts for this loop 8238 1726882410.49377: done getting the remaining hosts for this loop 8238 1726882410.49380: getting the next task for host managed_node3 8238 1726882410.49384: done getting next task for host managed_node3 8238 1726882410.49385: ^ task is: TASK: meta (flush_handlers) 8238 1726882410.49387: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882410.49389: getting variables 8238 1726882410.49390: in VariableManager get_vars() 8238 1726882410.49405: Calling all_inventory to load vars for managed_node3 8238 1726882410.49407: Calling groups_inventory to load vars for managed_node3 8238 1726882410.49409: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882410.49419: Calling all_plugins_play to load vars for managed_node3 8238 1726882410.49421: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882410.49426: Calling groups_plugins_play to load vars for managed_node3 8238 1726882410.51219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882410.53457: done with get_vars() 8238 1726882410.53487: done getting variables 8238 1726882410.53548: in VariableManager get_vars() 8238 1726882410.53566: Calling all_inventory to load vars for managed_node3 8238 1726882410.53569: Calling groups_inventory to load vars for managed_node3 8238 1726882410.53572: Calling all_plugins_inventory to load vars for managed_node3 8238 1726882410.53577: Calling all_plugins_play to load vars for managed_node3 8238 1726882410.53580: Calling groups_plugins_inventory to load vars for managed_node3 8238 1726882410.53583: Calling groups_plugins_play to load vars for managed_node3 8238 1726882410.55015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8238 1726882410.57104: done with get_vars() 8238 1726882410.57135: done queuing things up, now waiting for results queue to drain 8238 1726882410.57138: results queue empty 8238 1726882410.57139: checking for any_errors_fatal 8238 1726882410.57140: done checking for any_errors_fatal 8238 1726882410.57141: checking for max_fail_percentage 8238 1726882410.57142: done checking for max_fail_percentage 8238 1726882410.57143: checking to see if all hosts have failed and the running result is not ok 8238 1726882410.57144: done checking to see if all hosts have failed 8238 1726882410.57144: getting the remaining hosts for this loop 8238 1726882410.57146: done getting the remaining hosts for this loop 8238 1726882410.57154: getting the next task for host managed_node3 8238 1726882410.57158: done getting next task for host managed_node3 8238 1726882410.57159: ^ task is: None 8238 1726882410.57161: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8238 1726882410.57162: done queuing things up, now waiting for results queue to drain 8238 1726882410.57163: results queue empty 8238 1726882410.57164: checking for any_errors_fatal 8238 1726882410.57164: done checking for any_errors_fatal 8238 1726882410.57165: checking for max_fail_percentage 8238 1726882410.57166: done checking for max_fail_percentage 8238 1726882410.57167: checking to see if all hosts have failed and the running result is not ok 8238 1726882410.57168: done checking to see if all hosts have failed 8238 1726882410.57170: getting the next task for host managed_node3 8238 1726882410.57172: done getting next task for host managed_node3 8238 1726882410.57173: ^ task is: None 8238 1726882410.57174: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3 : ok=75 changed=2 unreachable=0 failed=0 skipped=61 rescued=0 ignored=0

Friday 20 September 2024 21:33:30 -0400 (0:00:00.217) 0:00:40.727 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 3.33s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 2.52s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gather the minimum subset of ansible_facts required by the network role test --- 2.51s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.48s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.39s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
Create test interfaces -------------------------------------------------- 1.82s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.51s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install dnsmasq --------------------------------------------------------- 1.46s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Install pgrep, sysctl --------------------------------------------------- 1.36s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.16s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.02s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.01s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.00s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Check if system is ostree ----------------------------------------------- 0.65s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.64s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.57s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Remove test interfaces -------------------------------------------------- 0.53s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Stat profile file ------------------------------------------------------- 0.51s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Check routes and DNS ---------------------------------------------------- 0.50s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6
Stat profile file ------------------------------------------------------- 0.50s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
8238 1726882410.57296: RUNNING CLEANUP
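For reference, the "skipping" result recorded earlier for TASK: Verify DNS and network connectivity (check_network_dns.yml:24) comes from a conditional guard: the log shows ansible_distribution_major_version != '6' evaluating to True and ansible_facts["distribution"] == "CentOS" evaluating to False. The sketch below shows how such a guard is typically written. It is only a sketch: the shell body is a hypothetical placeholder because the task never executed in this run, and the log does not reveal whether both expressions live on the task itself or one is inherited from an enclosing block.

    - name: Verify DNS and network connectivity
      ansible.builtin.shell: |
        # Hypothetical placeholder: the real command is not visible in this
        # log because the task was skipped before execution.
        getent hosts mirrors.fedoraproject.org
      when:
        # Both expressions are evaluated in the log above; on this host the
        # CentOS check is False, which produces the "skipping" result.
        - ansible_distribution_major_version != '6'
        - ansible_facts["distribution"] == "CentOS"

When every expression under when: is true the task runs; here the distribution check fails, so Ansible reports the task as skipped and records the failing expression as false_condition, as seen in the JSON result above.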